US20110098083A1 - Large, Ultra-Thin And Ultra-Light Connectable Display For A Computing Device - Google Patents
- Publication number
- US20110098083A1 (application Ser. No. 12/983,168)
- Authority
- US
- United States
- Prior art keywords
- display
- computing device
- image
- camera
- present
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2624—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects for obtaining an image which is composed of whole input images, e.g. splitscreen
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/72—Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
- H04M1/724—User interfaces specially adapted for cordless or mobile telephones
- H04M1/72403—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
- H04M1/72409—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
- H04M1/72412—User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories using two-way short-range wireless interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/16—Details of telephonic subscriber devices including more than one display unit
Definitions
- The present invention relates to digital image devices. More specifically, it relates to a thin, large and lightweight display that is connectable to a portable computing device which has a digital camera.
- The camera is able to record a single image in one embodiment and at least two images concurrently in a second embodiment, and to generate from those at least two images a stereoscopic image, a panoramic image, or a stereoscopic or panoramic image related to an existing image, to be displayed on the thin and large connectable display.
- Digital cameras are increasingly popular. The same applies to camera phones.
- The digital images taken by these devices are generated by a digital sensor and stored in a memory which can hold the data generated by the sensor.
- Data may represent a still image.
- Data may also represent a video image. Images may be viewed on the device. Images may also be transferred to an external device, either for viewing, for storage or for further processing.
- Panoramic images are also very popular and have been created from the time of photographic film to the present day of digital imaging.
- Stereoscopic or 3D images are also created by combining two images.
- A whole range of tools exists to combine two or more images of a scene into a single, combined, ideally seamless panoramic image or into a stereoscopic image.
- This process of combining is called registering, stitching or mosaicing, especially in the case of panoramic images.
- An advantage of a panoramic image is that it provides a view of a scene beyond what is usually possible with a common camera, with little or no distortion.
- For stereoscopic imaging, the combining of two images requires providing the appropriate horizontal parallax to create the correct depth illusion.
- The processes of picture taking for creating a panoramic or stereoscopic image involve many different technologies, apparatus and methods. Very common is the method of taking a first picture with a single-lens camera, followed by taking at least a second picture at a later time, and followed by combining the pictures. This method is not very user friendly or requires complex cameras or complex camera settings. Furthermore, this method may be troublesome for creating video images.
- Images taken by a small computing device can usually be displayed on a small integrated display screen on the computing device.
- The quality and the size of such integrated display screens are generally barely adequate and mainly serve to provide feedback when reviewing an image, not long-term viewing of video or high-quality display of images.
- A large display screen would enable a better image viewing experience.
- However, a large display screen integrated into a portable computing device would defeat at least some of the benefits of portability and small size.
- One aspect of the present invention presents novel methods and systems for recording, processing, storing and concurrently displaying a plurality of images, which may be video programs, as a panoramic or a stereoscopic image.
- In accordance with an aspect of the present invention, an apparatus is provided for generating a combined image from at least a first and a second image of a scene with a camera having at least a first lens associated with a first image sensor for generating the first image and a second lens associated with a second image sensor for generating the second image, comprising: an operational memory, enabled to store and provide data related to a first setting of the second lens, including an active sensor area, the first setting of the second lens being associated with data related to a first setting of the first lens; and a controller, applying data related to the first setting of the first lens for retrieving from the memory data related to the first setting of the second lens, the controller using the retrieved data for processing image data related to the active sensor area of the second sensor to create a stereoscopic image.
- In a further embodiment, the camera system is provided wherein the camera system is part of a mobile phone.
- FIG. 1 is a diagram of a camera for panoramic and/or stereoscopic images in frontal view in accordance with an aspect of the present invention.
- FIGS. 2 and 3 illustrate a panoramic image created in accordance with an aspect of the present invention.
- FIGS. 4, 5 and 6 illustrate a stereoscopic image created in accordance with yet another aspect of the present invention.
- FIG. 7 is a diagram of a camera in frontal view in accordance with another aspect of the present invention.
- FIG. 8 is a diagram of a camera in rear view in accordance with a further aspect of the present invention.
- FIG. 9 is a diagram of a camera in frontal view in accordance with yet another aspect of the present invention.
- FIG. 10 is a diagram of a camera in rear view in accordance with yet a further aspect of the present invention.
- FIGS. 11-13 are diagrams of a camera in frontal view in accordance with one or more aspects of the present invention.
- FIG. 14 is a diagram of a carrier of lens/sensor units for a camera for stereoscopic and/or 3D images in accordance with an aspect of the present invention.
- FIG. 15 is a diagram of a carrier of lens/sensor units for a camera for stereoscopic and/or 3D images in accordance with another aspect of the present invention.
- FIG. 16 is a diagram of a carrier of lens/sensor units for a camera for stereoscopic and/or 3D images in accordance with yet another aspect of the present invention.
- FIG. 17 is a diagram of a system having a camera and a display in accordance with an embodiment of the present invention.
- FIG. 18 is a computing device in accordance with an aspect of the present invention.
- FIGS. 19-21 are diagrams of a camera with a thin large sized display screen in accordance with one or more aspects of the present invention.
- FIGS. 22-24 illustrate detachable displays in accordance with an aspect of the present invention.
- FIGS. 25-30 illustrate connectable large displays in accordance with an aspect of the present invention.
- FIGS. 31-33 illustrate large displays with embedded lens/sensor units in accordance with an aspect of the present invention.
- FIGS. 34-36 illustrate small-thickness cameras to be integrated with a large thin display in accordance with an aspect of the present invention.
- FIG. 37 illustrates a remote controlled platform for a camera in accordance with an aspect of the present invention.
- FIG. 38 illustrates a display in accordance with an aspect of the present invention.
- FIGS. 39-40 illustrate a remote controlled platform for a camera in accordance with an aspect of the present invention.
- FIG. 41 illustrates a screen of a display in accordance with an aspect of the present invention.
- FIG. 1 shows in diagram a camera 100 having at least two lens/sensor units 101 and 102 .
- A lens/sensor unit contains a lens that corresponds to an image sensor, the image sensor being enabled to generate a plurality of signals from picture elements on the sensor.
- The image sensor may be a CCD or a CMOS image sensor or any other type of image sensor.
- An image sensor generally provides lines of pixels as data signals that can be processed by a processor and/or stored in a memory.
- An image sensor is enabled to generate a single image in one mode, and to generate a video image in a second mode.
- An image sensor is associated with a lens.
- A lens may be combined with a sensor in a housing. The combination of a lens with its associated sensor is called a lens/sensor unit.
- In one embodiment of the present invention, a lens is a fixed-focus lens.
- In another embodiment, a lens has a focus mechanism.
- A focus mechanism can be a manual focus mechanism, for instance by rotating the lens.
- In yet another embodiment, a lens has an auto-focus mechanism.
- Part of an autofocus mechanism can be a distance sensor 108 as illustrated in FIG. 1.
- In one embodiment, based on a distance measured by a single distance sensor, all of the lenses of the at least two lens/sensor units are controlled via that single distance sensor.
- In FIG. 1, the distance sensor is shown as a sensor 108 outside the lenses.
- In another embodiment, the distance sensor for focus setting is a through-the-lens distance sensor.
- In one embodiment, the at least two lens/sensor units are applied to combine two images into a single three-dimensional or stereoscopic image.
- In that case, the camera of FIG. 1 is a stereoscopic camera.
- In another embodiment, the at least two lens/sensor units are used to create a panoramic image of a scene. In that case there may be more than two lens/sensor units, and the camera of FIG. 1 is called a panoramic camera.
- An active area of an image sensor can be defined, the active area being smaller than the actual pixel area of the sensor. Pixels are generally read in horizontal lines, called scan lines. A sensor can also be read in non-horizontal lines, vertical lines, slanted lines or in a pseudo-random pattern.
- The sensor data can be stored in a memory.
- In one embodiment, a processor processes only pixels or pixel data related to an active area of an image sensor.
- In another embodiment, a memory only stores pixels from an active sensor area.
- In a further embodiment, a part of the memory is defined that only stores pixels generated by an active area of a sensor.
- In yet another embodiment, a processor is instructed to read and/or process only data from memory related to an active sensor area. In a further embodiment of the present invention, a processor is instructed to process data depending on its position in a sensor with regard to the active sensor area.
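The active-area idea above can be sketched in a few lines of code. The sensor model (a 2D list of pixel values) and the (row, column) bounds of the active area below are hypothetical illustrations, not values from the disclosure:

```python
# Sketch: restrict processing to the "active area" of an image sensor.
# The active area (row_start, row_end, col_start, col_end) would come
# from calibration; here it is a made-up example.

def read_active_area(sensor_pixels, active_area):
    """Return only the pixels inside the active area of the sensor."""
    r0, r1, c0, c1 = active_area
    return [row[c0:c1] for row in sensor_pixels[r0:r1]]

# A 4x4 sensor; the active area excludes a one-pixel border.
sensor = [[r * 4 + c for c in range(4)] for r in range(4)]
active = read_active_area(sensor, (1, 3, 1, 3))
# active is the 2x2 block [[5, 6], [9, 10]]
```

A downstream processor or memory would then only see the cropped block, matching the embodiments in which storage and processing are limited to the active sensor area.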
- In FIG. 2, lens/sensor unit 201 corresponds to lens 101 and unit 202 corresponds to lens 102.
- FIG. 2 shows three objects 203 , 204 and 205 in a scene to be recorded by the camera as a panoramic image.
- The sensors 201 and 202 are in transformed positions relative to each other, including a translation and a rotation.
- Merge lines 206 and 207 are determined. These merge lines help define the active areas of the sensors.
- An active area of a sensor is the area of pixels that will be used to create, and will appear in, the panoramic image. The merge lines, and thus the active areas, may change with camera setting and/or distance to objects or other imaging conditions.
- Both sensors have an area of overlap, so when their images are combined along the merge line 308 , as shown in FIG. 3 , a panoramic image 300 is formed.
- One has to address the rotation before images can be merged.
- One may adjust for rotation by, for instance, instructing a processor to read pixels in such a manner that the read pixel lines are aligned.
- One may also create an adjusted scanning direction based on a measured rotational misalignment. These adjustments, including interpolation, preferably take place before demosaicing of the raw image data.
- In one embodiment, areas and parameters in the image data are defined to perform, for instance, smoothing, blurring and pixel equalization to create a smooth transition between two active pixel areas.
- The image data is stored and/or displayed as a panoramic image, or the processed or raw but aligned data is stored in such a manner that a panoramic image is formed by merging the two images, each created from active sensor areas (keeping in mind that one may also use three or more lens/sensor units).
- The camera with at least two sensor/lens units is calibrated for different conditions, such as distance or focus setting, to associate a condition such as a focus setting with all settings required to create an appropriate panoramic image, and those settings are stored in a memory.
- In operation, a controller retrieves all associated settings from the memory and applies them in processing the image data to generate the panoramic image.
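The calibration-and-retrieval scheme described above can be sketched as a lookup table keyed by the condition. The keys and the setting fields below (merge column, vertical offset, rotation) are hypothetical illustrations of what such a memory might associate with a focus setting:

```python
# Sketch: a calibration memory associating a focus setting with all
# settings needed to merge two sensor images into a panorama.
# All values are made-up examples from a hypothetical calibration run.

calibration = {
    # focus_setting: (merge_column, vertical_offset_px, rotation_degrees)
    "near":     (150, 3, 0.4),
    "mid":      (170, 2, 0.4),
    "infinity": (190, 1, 0.4),
}

def settings_for(focus_setting):
    """Controller step: retrieve all stored settings for a condition."""
    return calibration[focus_setting]

merge_col, dv, rot = settings_for("mid")
```

The controller would then apply `merge_col`, `dv` and `rot` when defining active areas and aligning scan lines for the instant exposure.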
- FIG. 4 shows a stereoscopic camera 400 with two lens/sensor units 401 and 402.
- Camera 400 has at least one lens focus sensor.
- In one embodiment, camera 400 has an autofocus mechanism that focuses both lenses 401 and 402. The distance between the lenses should be sufficient to create the appropriate stereoscopic image.
- The image sensors are shown as 403 and 404, in mirror image of the front view of the camera 400.
- Sensor 404 is shown to have a translation and rotation relative to 403.
- The rotational misalignment is an angle θr and the vertical (translational) misalignment is dv.
- The vertical misalignment may be measured from an object 409, which is recorded as 409a by 403 and as 409b by 404.
- Vertical alignment or merge lines 405 and 406 are defined in image sensors 403 and 404.
- To create the depth illusion, the objects need to be positioned on a display screen with a parallax displacement that is proportional to the distance of the object to the camera.
- The horizontal displacement of the images for the instant distance is dh, which is defined, for instance, by the distance between the line 406 defined by a feature of the object and the parallax line 409.
- The images also have a vertical displacement dv defined by lines 407 and 408. In general, one would like the vertical displacement to be close to 0.
- FIG. 5 shows the aligned images to be appropriately displayed to create a 3D image. This may be done by a lenticular lens display screen, by using parallax barriers, by using anaglyphs, by alternately viewing left and right images using for instance switching glasses, or by any other appropriate 3D viewing system.
- The rotational alignment is addressed as in the panoramic case. It is known that, for the most comfortable viewing, the rotational misalignment should be minimized.
- The horizontal and vertical alignment can be addressed by reading the appropriate pixel lines at the appropriate starting point. As in the case of panoramic imaging, one should calibrate the alignment for different conditions, for instance for different distances, zoom factors, light conditions and the like.
- FIG. 6 shows the combined images to generate a 3D image with appropriate horizontal parallax and no vertical displacement.
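Reading pixel lines at an appropriate starting point amounts to a horizontal shift of one view relative to the other. A minimal sketch, assuming a small grayscale image as a 2D list and a hypothetical displacement dh of 2 pixels:

```python
# Sketch: apply a horizontal parallax displacement dh to one view of a
# stereoscopic pair while keeping the vertical displacement at zero.
# The image content and dh value are made-up examples.

def shift_horizontal(image, dh, fill=0):
    """Shift every row of a 2D image right by dh pixels, left-filling."""
    return [[fill] * dh + row[:len(row) - dh] for row in image]

left = [[1, 2, 3, 4], [5, 6, 7, 8]]
right = shift_horizontal(left, 2)
# right == [[0, 0, 1, 2], [0, 0, 5, 6]]
```

In a sensor implementation this shift corresponds to choosing the starting pixel of each scan line rather than moving data after readout.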
- The camera, in one embodiment of the present invention, is calibrated for different conditions such as distance or focus setting, and a condition such as a focus setting is associated with all settings required to create an appropriate 3D image; those settings are stored in a memory.
- In operation, a controller retrieves all associated settings from the memory and applies them in processing the image data to generate the 3D image.
- In accordance with a further aspect of the present invention, a camera is provided with three lens/sensor units for creating panoramic pictures. This is illustrated in FIG. 7 with lens/sensor units 701, 702 and 703 in camera 700.
- In one embodiment, an autofocus mechanism including a focus sensor is included.
- In another embodiment, the lenses are single-focus lenses.
- FIG. 8 shows the back side of the camera 700 with a display screen 800, which in one embodiment of the present invention is divided into at least three segments 801, 802 and 803.
- The camera 700 contains at least a processor to process image data and to control the settings of the camera and the display screen, wherein the button 804, for instance, can be used to select a display mode. Such a button may take the camera through consecutive modes by pushing it briefly, and a mode is selected by holding the button for a longer period of time.
- The camera also contains memory to store pre-set camera and display settings, which have been entered, for instance, during a calibration.
- The camera also contains a memory to store image data.
- The camera also contains a processor to process the sensor data according to preset settings; the memory can hold the processed image data, which can represent a still image or video images and can be a single lens/sensor unit based image or an image constructed from at least two lens/sensor units.
- Controllers, for instance for controlling lens focus, lens aperture and shutter speed, as well as memories, sensor data and the image processor, are connected internally, for instance by a data bus.
- Such structures are well known to one of ordinary skill in the art and are not shown in the figures to prevent obscuring the aspects of the present invention with known matter. Details of an internal processing and communication architecture of a digital camera are disclosed in, for instance, U.S. Pat. No. 7,676,150 to Nakashima issued on Mar. 9, 2010 and U.S. Pat. No. 7,667,765 to Turley et al. issued on Feb. 23, 2010, which are both incorporated herein by reference.
- A control button enables a menu that is displayed on at least a segment of the display, allowing a user to select a mode of display and of taking an image.
- An image is recorded in a certain mode when the recording is activated by a button, for instance button 807 .
- In one embodiment, all images recorded by the sensors are stored, and the selected mode determines how the recorded image is displayed. For instance, in one mode only the image taken by the center lens/sensor unit is displayed. In another mode, the created panoramic image is displayed.
- In one mode, the display screen functions as a viewfinder.
- In that mode, the images are shown but are not recorded.
- In another mode, the display acts as a viewfinder and an image is recorded.
- Generally, people take pictures of a scene in front of them.
- The display of the camera, acting as a viewer, continues to display the images, thus acting as a light for a person at a distance behind the camera.
- In one embodiment of the present invention, a button is provided on the camera that switches off the display while keeping the camera in a recording mode.
- Such a button, in a further embodiment, switches the image display to a separate handheld display that shows the image recorded by the camera while the embedded display of the camera is switched off. This embodiment of a separate handheld display for a camera will be further explained and provided herein.
- In one embodiment, an image is displayed on the display screen that is not generated by one of the lens/sensor units but is, for instance, an archival image that is stored on the camera or that has been made available and transmitted to the display by another device.
- The image being displayed on the display screen serves as a background to compare with an image from one or more lens/sensor units of the camera.
- The image may be displayed at a low intensity. It may be enhanced with edges of objects in the image by an edge detector, such as a Canny edge detector, as is known in the art.
- The images viewed on the display screen that are provided by one or more of the lens/sensor units may be displayed in a transparent mode that overlays the archival or externally provided image.
- This embodiment of the present invention allows one to retake an image of a scene as close as possible to the original setting.
- The original picture may have been in black and white, or a picture with or without persons or objects.
- An image may also have been taken under different weather conditions, during a different season, time of day, or any other conditions that have been changed.
- One may want to retake an image or video that is as close as possible to the original setting or frame. For instance, a picture of a scene was taken with a single lens/sensor camera, and one wants to retake the picture as a video or as a panoramic or a 3D image.
- The overlay mode provided herein as one aspect of the present invention allows retaking that image as closely as possible to the original image.
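The low-intensity overlay step can be sketched as a weighted blend of the live view and the archival image. The grayscale values and the weight `alpha` below are hypothetical; a real implementation would additionally run an edge detector (e.g. Canny) over the archival image before blending:

```python
# Sketch: overlay a low-intensity archival image on the live viewfinder
# image so the user can line up a retake. Pixel values are 0-255
# grayscale; alpha is the (hypothetical) archival weight.

def blend(live, archival, alpha=0.3):
    """Per-pixel weighted blend of two equally sized grayscale images."""
    return [
        [round((1 - alpha) * lv + alpha * av) for lv, av in zip(lrow, arow)]
        for lrow, arow in zip(live, archival)
    ]

live = [[200, 200], [200, 200]]
archival = [[0, 100], [100, 0]]
overlay = blend(live, archival, alpha=0.5)
# overlay == [[100, 150], [150, 100]]
```

A small `alpha` keeps the archival image faint, matching the "transparent mode" described above.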
- Image registration by comparing edges and transformation of images is known.
- Some aspects of using an archival image on a camera to improve an image are known and are described in U.S. Pat. No. 7,616,248 to Parulski et al. issued on Nov. 10, 2009 which is incorporated herein by reference.
- However, the archival images in Parulski are displayed and applied differently, and not to retake an image that was taken at a completely different time. Parulski also does not provide an image that is provided by another camera. Parulski further does not disclose image matching, and does not disclose using the archival image as a template for retaking the image as a video or as a panoramic or 3D image.
- The camera 700 also has an external port to dock to another device.
- Such a port may be a USB port, which allows the camera to be connected, for instance, to a computing device such as a computer for exchanging images.
- The port may also be used to connect the camera to a display having a large display screen to display the images taken by one or more lens/sensor units.
- The port may also be a wireless port.
- A user may have a computing device, such as a mobile phone, that has a camera. Such a camera may not be enabled to take images in a panoramic or a 3D mode.
- The camera 700, in a further embodiment of the present invention as illustrated in FIG. 8, has a communication port 806 that communicates directly with a mobile computing device such as a mobile phone. Such a port can be enabled to work in a wired mode. It may also be configured to work in a wireless mode.
- For instance, the camera may have a port 806 that is a Bluetooth port that can exchange data with a mobile phone.
- The mobile phone can provide camera 700 with an image that serves as an archival image.
- Conversely, the camera 700 in one embodiment of the present invention provides the mobile phone with image data, for instance of a panoramic image.
- The cell phone may have a display screen with a different resolution than that of the camera 700.
- In one embodiment, images are stored in a high resolution and are downsampled on a device to be displayed at an appropriate resolution on a display screen.
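Downsampling a high-resolution image for a lower-resolution screen can be sketched as simple 2x2 block averaging. The tiny grayscale image below is a hypothetical illustration; a production device would use a proper resampling filter:

```python
# Sketch: halve each dimension of a grayscale image by averaging
# non-overlapping 2x2 pixel blocks (box filter downsampling).

def downsample_2x(image):
    """Return an image half as wide and half as tall as the input."""
    return [
        [
            (image[r][c] + image[r][c + 1]
             + image[r + 1][c] + image[r + 1][c + 1]) // 4
            for c in range(0, len(image[0]), 2)
        ]
        for r in range(0, len(image), 2)
    ]

img = [[0, 4, 8, 12], [0, 4, 8, 12], [16, 16, 16, 16], [16, 16, 16, 16]]
small = downsample_2x(img)
# small == [[2, 10], [16, 16]]
```

Repeating the step brings a stored high-resolution image down to whatever resolution the receiving display screen requires.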
- The above-provided capability of importing and/or comparing a generated image with an archival image enables a further aspect of the present invention.
- In this aspect, the camera has processing capabilities to segment an object from its background.
- Image segmentation algorithms are well known and are, for instance, described in the online lecture notes "Segmentation" by Professor Allan D. Jepson, downloaded from <http://www.cs.toronto.edu/~jepson/csc2503/segmentation.pdf>, which is dated Fall 2009 and which is incorporated herein by reference.
- In one embodiment, the segmented image is inserted into an actual image being taken by a lens/sensor unit in the camera.
- One may modify the size of the inserted image by opening a menu on the camera to be shown on the display screen 1002 as shown in FIG. 10 .
- An input device 1003 with different button modes can be used to activate a menu and select a menu option.
- Device 1003 may also act as a mouse to move a cursor over the display screen, to draw and select areas of interest which may contain an object and activate a menu option, which may be an object segmentation.
- One can thus select the object to be segmented, for instance by drawing a box around the object, which may be a person.
- The result, being an image of a segmented object, can then be selected either by again drawing a box around it or by clicking on 1003 after selecting it with a cursor.
- The segmented object may be displayed as a transparent image over the scene actually imaged by a lens/sensor unit. When satisfied with the size and position, a user may confirm the position of the object on the display screen with button 1003 and change the image from transparent to opaque.
- In one embodiment, the inserted image is synthesized from a 2D image into a 3D image.
- Methods and apparatus therefor are known and are, for instance, disclosed in U.S. Pat. No. 6,445,833 to Murata et al. issued on Sep. 3, 2002, which is incorporated herein by reference.
- In this manner, a 2D image is synthesized into a 3D image with a parallax that reflects a desired distance from a point of view on the display screen.
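Synthesizing a stereoscopic pair from a single 2D image, as described above, can be sketched by shifting the image horizontally in opposite directions to introduce a chosen parallax. The one-row image and the 2-pixel parallax below are hypothetical; this is a simplification of the depth-dependent methods referenced in Murata et al.:

```python
# Sketch: create a left/right view pair from one 2D image by giving
# each view half the desired parallax, in opposite directions.
# A uniform shift places the whole image at one apparent depth.

def synthesize_stereo(image, parallax_px):
    """Return (left_view, right_view) with the requested total parallax."""
    half = parallax_px // 2

    def shift(img, d):
        if d >= 0:  # shift right, zero-fill on the left
            return [[0] * d + row[:len(row) - d] for row in img]
        # shift left, zero-fill on the right
        return [row[-d:] + [0] * (-d) for row in img]

    return shift(image, half), shift(image, -half)

flat = [[1, 2, 3, 4]]
left_view, right_view = synthesize_stereo(flat, 2)
# left_view == [[0, 1, 2, 3]], right_view == [[2, 3, 4, 0]]
```

Applying a larger shift to a segmented object than to its background would place the object at a nearer apparent distance, which is the effect the embodiment above uses when inserting an object at a desired depth.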
- The camera may be connected to a network, and may receive a plurality of images from different sources via the network.
- In one embodiment, each image has an identifier that allows the camera to determine the relative size and earlier positions a particular segmented image had on the display screen. This allows the camera to position the image at the location and size that it had in a previous image.
- In one embodiment, a display with a display screen is separate from the camera.
- The camera may be connected via a communication port to a computing device having its own display screen, such as a PDA or a cell phone.
- In one embodiment, control of the camera is provided to the computing device, such as the cell phone.
- The computing device can open at least one menu on its display screen that enables control of the separate camera. This allows a user to be part of an image taken by the camera while retaining control of the camera.
- In one embodiment of the present invention, the display and camera are physically separate and are connected through a wired connection. In a further embodiment of the present invention, the display and camera are physically separate and are connected through a wireless connection.
- Imported images may be recorded in stereoscopic format.
- One may either receive the constituting separate 2D components of each image, or one may separate the stereoscopic image into its constituting 2D components and reassemble the components with a parallax representing a desired distance from a point of view on the device.
- In one embodiment, an imported 3D image is reduced to a 2D image and the 2D image is synthesized into a 3D image having a parallax that reflects a desired distance from the object to the point of view.
- In one embodiment, a plurality of users each have a camera with remote control and display screen as provided above.
- Each camera is trained and focused on the user and generates a segmented image that is shared with at least one other camera.
- A camera 900 is provided for creating a 3D image from at least two lens/sensor units 701 and 703, as shown in FIG. 9. While the camera may look the same as a two-lens/sensor-unit panoramic camera, there are differences. In both the panoramic and 3D cameras the sensors have to be aligned appropriately. However, in the 3D case the lenses (or lens optical axes) should be substantially or almost parallel. In the panoramic case the lens optical axes should be positioned at a slight angle to create a broader field of vision.
- In accordance with an aspect of the present invention, a connectable or even dockable device is provided that can take and/or display panoramic and/or 3D images. For instance, one may take a panoramic image with a camera and send it to the device that is connected with the camera. One may also take a 3D image with the camera and display it on the camera, and send the single-lens image for display to a connected computing device, or the 3D image for storage and/or further transfer and/or non-3D display on the connected device.
- a camera, in accordance with an aspect of the present invention is a 3D camera having at least 2 lens/sensor units.
- a camera, in accordance with another aspect of the present invention is a panoramic camera having at least 2 lens/sensor units.
- a camera, in accordance with an aspect of the present invention is a combined 3D and panoramic camera having at least 4 lens/sensor units.
- One embodiment of the present invention 1100 of such a combined 3D/panoramic camera is shown in diagram in FIG. 11 .
- A combination of two lens/sensor units 1101 and 1102 may be dedicated to 3D images, in which case the optical axes of the lenses are parallel or almost parallel. This allows one lens/sensor unit of 1101 and 1102 to be dedicated to generating a single reference image.
- The distance between the lenses of 1101 and 1102 is, in one embodiment of the present invention, preferably a human parallax distance.
- At least two additional lens/sensor units 1103 and 1104 in 1100 may be dedicated to taking panoramic images. To create a wide field of view the optical axes of the lenses are positioned at an angle.
- an additional lens/sensor unit 1201 is placed between 1103 and 1104 as is shown in a diagram of a camera 1200 in FIG. 12 .
- This allows for a higher quality panoramic image. It also allows for using an image generated by 1201 to generate a neutral or reference image.
- in FIG. 13 of a camera 1330 all lenses for 3D and panoramic image generation are placed on a single line.
- the lens/sensor units 1301 and 1302 are for instance for generating 3D images, while 1303 , 1304 and 1305 are used for generating a panoramic image.
- a camera may have at least two sets of two lens/sensor units wherein each set has lenses on parallax distance with parallel optical axes, and wherein the optical axes of the first set and the at least second set have an angle between them to create at least two panoramic images which are combined into a panoramic 3D image.
- FIG. 14 illustrates another aspect of the present invention. It shows a lens/sensor carrier 1400 with aligned lens/sensor units attached. It was explained above that sensors have to be rotationally aligned to create appropriate panoramic images and to diminish negative effects on 3D images.
- One way to assure alignment is to attach at least two lens/sensor units 1403 and 1404 to a rigid carrier material 1401 .
- the lens/sensors units can be aligned during manufacturing and/or assembly of 1400 and should be fixed in their aligned position.
- Material 1401 should be rigid with negligible bending and distortion over the life-time of the camera. This requires that the carrier will also not be bent or distorted after being assembled into a camera.
- a motor 1405 that can rotate the lens/sensor unit can be included on the carrier 1400 . This allows for motor driven alignment of the lens/sensor unit.
- the motor can be a stepping motor with a mechanism that significantly reduces the rotation angle of the lens/sensor unit per step. Per motor step the rotation of the lens/sensor unit is well below 1 degree in one embodiment of the present invention, and smaller than 1 minute in yet another embodiment of the present invention.
- a motor may be an actuator such as a piezo-electric actuator. Piezo-electric actuators are available that have a minimum displacement below 1 micron, thus allowing rotational displacement of a sensor that is in the single pixel range.
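The sub-degree and single-pixel figures above can be checked with simple arithmetic. The sketch below uses assumed example values (pixel pitch, lever arm, actuator step are illustrative, not from the disclosure) to show how a sub-micron piezo step at the sensor edge translates into a rotation well under one minute of arc and an edge displacement in the single-pixel range:

```python
import math

# Assumed example values (not from the disclosure):
pixel_pitch_um = 1.4      # typical small-image-sensor pixel pitch
lever_arm_mm = 2.0        # distance from rotation center to sensor edge
actuator_step_um = 0.5    # piezo minimum displacement, below 1 micron

# A linear piezo step applied at the sensor edge rotates the sensor by:
angle_rad = (actuator_step_um * 1e-6) / (lever_arm_mm * 1e-3)
angle_arcmin = math.degrees(angle_rad) * 60   # well under 1 arc minute

# The same step expressed as edge displacement in pixels:
edge_shift_px = actuator_step_um / pixel_pitch_um   # single-pixel range
```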
- FIG. 15 shows a further embodiment of the present invention of a lens/sensor assembly 1500 with 3 lens/sensor units.
- an additional lens/sensor unit 1502 is provided.
- lens/sensor unit 1403 is provided with a motor or actuator 1501 to allow mechanical alignment of the sensors.
- FIGS. 14 and 15 illustrate the carriers 1400 and 1500 in front view.
- FIG. 16 shows in diagram the carrier 1500 in side view.
- FIG. 16 shows that the optical axis of 1404 is rotated by a slanted support wedge 1502 .
- the rotation angle effected by the support in one embodiment of the present invention is less than 10 degrees.
- the rotation angle effected by the support in another embodiment of the present invention is more than 10 degrees but less than 30 degrees.
- the rotation angle effected by the support in another embodiment of the present invention is more than 30 degrees
- lens/sensor units may be placed on a carrier and bonded to the carrier in a high precision environment.
- at least two sensor units may be accurately placed on the carrier and at least rotationally aligned relative to each other.
- a carrier becomes an integrated panoramic and/or 3D imaging unit that is robust and can be placed in a housing of a camera.
- Electronic components and communication connection may also be placed on the carrier.
- the most sensitive stage, which is probably the rotational alignment, can take place in a manufacturing or laboratory environment that accommodates this type of high precision positioning and bonding.
- Such high precision environment is for instance present in semi-conductor processing and is known in semiconductor steppers.
- onto the carrier unit with lens/sensor units fixed in aligned position one may install additional components such as lenses, zoom lenses, an auto-focus mechanism, a focus mechanism for the lenses if required (when the lenses are not fixed focus lenses, which is also possible in a further embodiment of the present invention), an aperture setting mechanism, a shutter mechanism, light and distance sensors and the like, which are all known components in photographic imaging.
- the carrier with at least the lens/sensor controllers and supporting electronics and connectors may be in a shape and completeness that allows a camera manufacturer to obtain a complete either panoramic, 3D or 3D/panoramic unit that can be assembled with housing, buttons and knobs, and a display screen into either a complete panoramic, 3D or 3D/panoramic camera.
- the carrier may include an external communication link or may be attached to a communication link or device in the housing that allows communication such as the exchange of data, which may be image data or control/instruction data with the outside world, such as a computing device.
- the calibration steps that were already disclosed as an aspect of the present invention are performed during or right after an initial power up of the electronics on the carrier with the lens/sensor units attached and the settings stored in for instance a memory, such as a ROM memory. If one achieves a reliable level of accuracy and repeatability of creating the carrier with lens/sensor units attached one may skip the actual calibration step for the mass produced carriers, and replace this by downloading the calibration settings from an external memory or implement them in an electronic device on the carrier.
- the calibration steps are determined from a mother or standard carrier, which may be used to test the performance of mass produced carriers against.
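Applying stored calibration settings, whether measured at power-up or downloaded from an external memory, might look like the following minimal sketch. The record layout, names and values are assumptions for illustration; the disclosure only requires that alignment settings be stored (for instance in a ROM) and applied:

```python
import numpy as np

# Hypothetical calibration record as it might be stored in a ROM on
# the carrier: a per-sensor rotational offset and pixel translation,
# determined against a mother/standard carrier. Values are invented.
CALIBRATION = {
    "sensor_1": {"rotation_deg": 0.0, "shift_px": (0, 0)},   # reference unit
    "sensor_2": {"rotation_deg": 0.2, "shift_px": (1, -2)},  # measured offset
}

def apply_calibration(image, sensor_id):
    """Apply the stored alignment correction to a raw sensor frame.
    Only the integer translation is applied in this sketch; a full
    pipeline would also resample for the small residual rotation."""
    dy, dx = CALIBRATION[sensor_id]["shift_px"]
    return np.roll(np.roll(image, dy, axis=0), dx, axis=1)
```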
- a camera is provided that is a panoramic camera, a 3D camera or a 3D/panoramic camera.
- the camera may be a stand alone camera. It may also be part of a computing device or a mobile phone.
- the camera has a communication interface to communicate with a network or with a computing device to exchange data, including image data.
- the camera may transfer image data to the network or computing device.
- the camera may also transfer status information to the network or computing device.
- a computing device may have a display screen for displaying an image taken by the camera.
- the camera may also receive data from the network and/or computing device.
- the camera may receive image data, instructions to perform certain tasks or processing updates from the network and/or computing device.
- the camera may be connected to for instance a PDA or mobile phone to act as an external imaging device.
- a connection between the camera and computing device such as a mobile phone may be established via a wired connection, including a USB port or a wireless connection such as a Bluetooth® connection.
- the camera may also have a connection directly with a display screen wherein a display screen is enabled to display image data but is not enabled to run general purpose computer programs.
- Devices to display stored documents are known, for instance eReaders from Sony® and Amazon's® Kindle®.
- Apple's® iPAD® has its own display screen, but has additional computing capabilities and is in fact its own computing device.
- Computing devices with a display screen are another embodiment of the present invention with which the camera as provided as one or more aspects of the present invention can communicate to display image data in real-time or substantially in real-time. Some minimal transmission delay and buffering delay may create a somewhat delayed real-time experience.
- real-time or substantially real-time display of an image, graphics, text, or an interface generated by a computing device is different from a computing device completing an image, text, document or otherwise completed file, transmitting it to a display and the display processing the file to a display screen.
- FIG. 17 illustrates a display embodiment of the present invention.
- the camera 1700 which may be a panoramic/3D camera in accordance with an aspect of the present invention has a communication port/interface 1701 to communicate with a communication port 1708 of a display 1705 .
- the display 1705 has a display screen 1706 which may be enabled to display 3D images.
- the display has display screen controls 1707 to allow a user to provide instructions related to an image. The controls may be used to adjust contrast, intensity and size of images, and to provide instructions to move forward or backward in a series of images, zoom, capture, store and recall images and the like.
- the display 1705 also has a processor and memory/storage to allow processing of an image and storage and retrieving of images.
- storage capabilities are temporary and limited to a session, for instance lasting for a period of time that the camera 1700 is connected to the display 1705 .
- processing capabilities of the display 1705 are limited to image processing and do not provide general computing capabilities.
- the display is connected to a mobile computing device 1800 as illustrated in FIG. 18 .
- the mobile computing device 1800 may include a cellphone with a dial pad 1803 , a display screen 1802 , a pointing device 1804 , cellphone capabilities and an antenna 1801 to connect to a communication network. It also has a communication interface 1805 which allows communication with display communication interface 1708 . Device 1800 also may have a communication interface to communicate with camera interface 1702 . Communication between the devices may take place over wired or wireless communication links.
- the device 1800 has a powerful processor 1806 enabled to run an operating system such as Microsoft Windows® or Linux or any other operating system, and applications such as are commonly used on a computer, for instance a word processor, presentation software, communications, e-mail, browsers, spreadsheets, games, scientific software and the like, or any other computer program that can run on a computing device.
- the device 1800 also has mass storage capabilities 1807 to store programs and data and the like.
- the device 1800 has a projector 1808 which is enabled to project on a screen or a wall an image. It may also have a projector 1809 which is enabled to create a virtual keyboard or input device, such as disclosed by U.S. Pat. No. 7,340,077 to Gokturk issued on Mar. 4, 2008 which is incorporated herein by reference.
- the size of device 1800 is very small, and can be carried in a jacket or trouser pocket or a purse and is significantly smaller than present day laptop computers such as manufactured for instance by Hewlett Packard® or Apple®. Such a complete computer can be so small because it does not have bulky input and output devices or storage media such as large magnetic disks.
- the device 1800 has included the camera 1700 .
- the problem of lacking input/output devices can be solved with locally positioned display devices 1705 , which may be eReader devices as disclosed above.
- the devices may have a keyboard or a virtual keyboard for instance created with screen 1706 being a touch screen or a projector 1710 which may project a virtual keyboard.
- a display is part of an airplane seat.
- a keyboard may also be part of a seat.
- the display and keyboard have communication ports to connect to a computer 1800 of a passenger.
- a camera is part of an eReader. This is shown in diagram in FIG. 19 .
- the term eReader herein is used as a term for a thin, large, lightweight display that acts as a true display, enabled to provide still and video images in full color in real-time, but which is much thinner than a tablet such as the Apple iPAD®.
- An eReader in its known form displays a content of a file.
- the term eReader as an aspect of the present invention includes the capability to show screens and images in real-time.
- a computer requires a processor, storage, input device and a display.
- Current devices are compromises along the different properties of a computing device. If they are small, they usually have small displays. With larger displays, computers become bulky and look like the current tablets or laptop computers.
- a very portable, mobile and fully usable (with no screen limitations) portable and mobile computer has been provided.
- the unexpected aspect is that the usability of the device is improved by separating the screen/display from the actual processor containing device, rather than integrating them as is nowadays common practice in laptops, PDAs, mobile phones, smart phones, known eReaders, tablets and the like.
- the novel eReader 1900 has a screen 1901 . Such a screen is larger in surface than a standard camera display screen.
- a lens 1902 which may be a plurality of lenses as discussed above, each lens associated with an optical sensor is/are captured or manufactured into a housing of the eReader or other portable device 1900 that has a screen.
- eReaders currently have a black/white screen which is known as ePaper or eInk.
- manufacturers at the time of this invention have indicated plans to move into the area of color display screens.
- One embodiment is by the technology of Organic Light Emitting Diode or OLED display screens.
- in a variation of the embodiment of FIG. 19 , a single lens camera as is shown in FIG. 20 is part of an eReader. FIG. 20 shows an eReader housing 2000 with a screen 2001 . Incorporated in the housing is a single lens/sensor unit 2002 .
- an eReader is provided that is also a 3D camera or a panoramic camera or a 3D/panoramic camera. These cameras are contemplated to have one or more aspects of the cameras as already described herein.
- the advantage is that the display has a large display screen and is portable and preferably has a light weight.
- the size of a screen in one embodiment of the present invention is not smaller than a diagonal measurement of 6 inches. In another embodiment of the present invention a size of a screen is not smaller than a diagonal measurement of 10 inches. In another embodiment of the present invention a size of a screen is not smaller than a diagonal measurement of 15 inches.
- the length and width of a display screen is the same. In another embodiment of the present invention the length is greater than the width.
- the large screen size allows an unparalleled direct view of a recorded image. Even at relatively low resolution this is an attractive option.
- the current Apple iPAD® has a color screen with a 1024-by-768-pixel resolution at 132 pixels per inch.
- a novel eReader with a camera and a screen of such a resolution as provided as an aspect of the present invention will provide a completely new experience and enjoyment of taking pictures and video images with instant and real-time review capabilities.
- a novel eReader is provided as an aspect of the present invention to include additional functionality, as described previously, beyond taking pictures or videos.
- devices 1900 and 2000 are any portable computing device with a display screen, with a lens or plurality of lenses associated with an image sensor or a plurality of image sensors 1902 captured or manufactured into a housing of the device 1900 .
- FIG. 19 shows how lenses 1902 are provided at the side of the housing opposite the screen or display.
- One may provide a housing with lenses that is hinged to a housing of the display, so one can swivel the lenses in relation to the display.
- the devices as shown in FIGS. 19 and 20 are only a camera with a very large display screen and memory to store data and processing capabilities to display images from an image sensor or from memory on the display and a control to control the camera and the screen.
- no other functionality is required and the devices are just digital cameras with a very large viewing screen.
- other capabilities are added.
- these cameras with very large screens have a communication port to off-load data such as image data, for instance to a computing device.
- these devices are also enabled to receive data through a communication port.
- these large display screen cameras have additional processor and programming capabilities.
- the realization of this type of camera and display device is enabled by providing thin, sturdy, light weight housing, small and highly integrated powerful processing, memory and communication capabilities, very small lens/sensor units with appropriately sized connectors, light weight display screens and appropriate power sources.
- This will enable a portable camera or display device that is ultra-light and ultra-thin and thus ultra-portable with a very large display screen offering a novel experience in still image and video recording and viewing.
- the large display screen camera has only one lens/sensor unit.
- the large display screen camera has at least two lens/sensor units.
- the camera with large display screen is characterized by its format, its light weight and its being thin. It is portable and easy to carry and to store, for instance in a briefcase or portfolio, and has a format that provides a large display screen while being thin and of low weight.
- the display screen has a diagonal that is larger than 10 cm in one embodiment of the present invention, larger than 15 cm in another embodiment of the present invention, larger than 20 cm in yet another embodiment of the present invention and larger than 30 cm in yet another embodiment of the present invention.
- the thickness in one embodiment of the present invention is less than 3 cm, less than 2 cm in another embodiment of the present invention, less than 1 cm in yet another embodiment of the present invention and less than 0.5 cm in yet another embodiment of the present invention.
- the weight in one embodiment of the present invention is less than 20 ounces.
- the weight in another embodiment of the present invention is less than 15 ounces. In yet another embodiment of the present invention the weight is less than 10 ounces.
- the weight and thickness will be determined by power supply, housing, and display screen and the way of selecting, modifying and combining these elements into a device.
- the lens/sensor unit may be an integrated lens/sensor unit for instance as available from Omnivision of Santa Clara, Calif., of which one example is the CameraCube. Samsung of Korea also manufactures very thin and powerful camera modules. For instance on URL ⁇ http://www.engadget.com/2008/03/18/samsung-announces-worlds-thinnest-8-megapixel-cellphone-module/> on Mar. 18, 2008 a small ultrathin camera module was shown that was about 8.5 mm thin and had a 5 Megapixel sensor and a lens. It is about 1.5 by 3 cm in width and length. Such a module can be incorporated in a thin housing of a large display or of a reader. The module has internal processing capabilities.
- a connector that makes it easy to connect to other components of a camera, to a processor, to a controller, to a bus or other communication devices and to a display.
- modules can be incorporated in the housing to enable a camera as disclosed herein in accordance with one or more aspects of the present invention. A copy of the article/website is incorporated herein by reference.
- Camera controllers are also known and are for instance disclosed in U.S. Pat. No. 6,727,941 to Coleman, issued on Apr. 27, 2004 which is incorporated herein by reference.
- a large display screen system with a controller is the iPAD® of which a teardown is provided in ⁇ http://www.isuppli.com/News/Pages/User-Interface-Focused-iPad-Changes-the-Game-in-Electronic-Design-iSuppli-Teardown-Reveals.aspx> which is incorporated herein by reference.
- a large display screen portable device such as a novel eReader, is enabled to communicate with a camera, such as a camera phone, by wired or wireless communication and is enabled to display an image or video taken by the camera.
- a large display screen camera such as provided in FIGS. 19 and 20 in a further embodiment of the present invention are provided with a communication device to communicate with another device or with a network.
- An example of such a communication device is a Bluetooth interface, of which an extensive description is available on ⁇ http://www.bluetooth.com/English/Technology/Building/Pages/Specification.aspx>.
- An example of a network communication interface is the Wi-Fi interface of which a specification is available on ⁇ www.wi-fi.org>.
- when eInk or similar displays are used, either in black/white or color, one issue may be the refresh rate of the screens. Refresh rates as low as 0.1 sec have been reported, though refresh rates of 0.5 up to 1 sec are also common. High quality video requires about 50 or 60 frames per second, which can be achieved by high quality display screens. In case low refresh rate display screens are used one may still display ‘sampled’ video images.
- a display has a controller with a clock that selects images from a sequence of video images that may be stored on a memory in the camera to provide a lower video frame rate to accommodate the display screen. In one embodiment of the present invention a controller selects fewer than one in 10 of the frames from an image sequence to be displayed.
- a controller selects fewer than 25 fps but more than or equal to 10 fps from an image sequence to be displayed.
- a controller selects more than 25 fps from an image sequence to be displayed. This means that a camera records high video rates, up to 60 fps or higher and stores these video sequences, for instance to be played on an external display, but selects a lower display screen rate to accommodate a refresh rate of a current display.
- a camera records images at a lower than 60 fps rate.
- a display screen of the camera may be the main display screen for displaying images. No higher quality fps display in such an embodiment of the present invention is envisioned, and only a reduced video frame rate is implemented.
- the recording rate is less than 10 fps.
- the video rate is lower than 25 fps but higher than or equal to 10 fps.
- the video rate is equal to or higher than 25 fps.
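The frame-selection controller described in the preceding paragraphs can be sketched as a simple decimation step: the camera records at a high rate, and the controller picks a subset of frames matching the display's refresh capability. Function and parameter names below are illustrative assumptions:

```python
def select_display_frames(frames, capture_fps, display_fps):
    """Pick a subset of a recorded video sequence so that a screen
    with a low refresh rate shows 'sampled' video, while the full
    sequence remains stored for playback on a faster external
    display."""
    if display_fps >= capture_fps:
        return list(frames)          # display can keep up; show everything
    step = capture_fps / display_fps
    return [frames[int(i * step)] for i in range(int(len(frames) / step))]

# 60 fps capture shown on a screen that can refresh about 10 times/sec:
sampled = select_display_frames(list(range(60)), 60, 10)
```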
- a device having a large display receives image data that is of a higher resolution quality than can be displayed on the current display screen.
- the device with the large display screen performs downsampling of the higher resolution image data.
- a separate computing device performs downsampling of the higher resolution image data before it is transmitted to the display device.
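Downsampling higher-resolution image data, whether performed on the display device or on a separate computing device before transmission, can be sketched as a block average. This is an illustrative simplification assuming a grayscale image whose excess edge rows/columns may be cropped:

```python
import numpy as np

def downsample(image, factor):
    """Block-average a grayscale image by an integer factor, e.g.
    before sending high-resolution camera data to a lower-resolution
    display screen. Edge rows/columns that do not fill a full block
    are cropped."""
    h, w = image.shape
    h, w = h - h % factor, w - w % factor
    blocks = image[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))
```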
- a thin camera device 2100 with a large display screen 2101 as shown in diagram in FIG. 21 has a large display screen that is rectangular but not square.
- One preferred way to display an image such as a still image or a video is in landscape format as shown in FIG. 21 .
- the device 2100 is a multi-purpose computer or an eReader for instance. As an eReader the device can be used in portrait mode.
- a multi-lens unit 1902 as shown in FIG. 19 in the shown position can take 3D and panoramic images in landscape mode. One may want to take images in landscape mode even if the display screen is oriented in portrait mode. This is shown in an embodiment of the present invention as shown in FIG. 21 .
- the multi-lens unit 2102 is part of a rotatable housing.
- This housing can be rotated by pushing for instance a protrusion 2103 from position 2104 (in landscape position) to position 2105 (in portrait position).
- in position 2105 the image can be shown in diminished size as a landscape image on a display screen in portrait orientation.
- the image can be stored and be displayed as a landscape image on the display screen in landscape orientation also.
- Stand-alone digital picture frames are increasingly popular. These allow the display of a picture taken by a digital camera on a computer display screen. Such a digital picture frame is for instance disclosed in U.S. Pat. No. 6,975,308 to Bitetto et al. issued on Dec. 13, 2005 which is incorporated herein by reference. However, these digital displays require a connection to connect to a computer device and/or a camera. In one embodiment of the present invention, a digital picture display is provided wherein the camera is part of the display as is shown in for instance FIGS. 19-21 .
- a display screen of a portable and self contained computing device is part of a detachable portable and self contained display. This is illustrated in FIG. 22 with a computing device 2200 and a detachable display 2206 with a screen 2211 .
- the computing device 2200 communicates with the display 2206 through a port 2201 in the computing device with a communication port 2205 in the display 2206 .
- the communication ports 2201 and 2205 may communicate in one embodiment of the present invention in a wired manner. This may be for instance a wired USB connection.
- the communication ports 2201 and 2205 are wireless communication ports that include wireless circuits and antennas to communicate. For instance the connection may be a wireless Bluetooth connection.
- a display screen herein is preferably a computer display screen, wherein an image is treated as an image of pixels and thus can be processed by a digital processor and can be stored in a memory or a buffer for temporary storage.
- a wireless connection between a display and a computing device is disclosed in US patent Application Publ. Ser. No. 20060061951 to Hara published on Mar. 23, 2006 which is incorporated herein by reference.
- because the computing device 2200 and the display 2206 are detached and are both self contained, they each require their own power source, which may be a battery.
- a battery 2207 to power the display 2206 has been identified in FIG. 22 .
- the display 2206 has an inductive receiver 2207 to power the display. Inductive power in that case will be provided from an external source and enables the display 2206 to receive inductive power with power receiver 2207 .
- This aspect of inductive power has been disclosed in U.S. Pat. No. 7,741,734 to Joannopoulos et al. issued on Jun. 22, 2010 which is incorporated herein in its entirety.
- the wired connection such as the USB wired connection can be used to have the power source of the computing device power the display.
- a USB connection is preferably a real-time USB connection and not a file oriented connection.
- a separate wire to a power source in another embodiment can also be used to power the display.
- the USB mini-A or mini-B connector can be used in a wired USB connection.
- the actual receptacles and plugs therein are 3.1 mm thick and are sufficiently small to be used in a display housing that is 5 mm thick. However, the housing of the USB mini receptacles is about 8.1 mm thick.
- a mini USB receptacle is modified to fit in a display housing that in one embodiment is 7 mm thick or smaller. In accordance with an aspect of the present invention a mini USB receptacle is modified to fit in a display housing that in one embodiment is 5 mm thick or smaller. In one embodiment the receptacle of a USB Micro may be used, if needed modified to fit the needs of connecting the display with the computing device and allowing the computing device to power the display. In a further embodiment the housing of the receptacle is part of the housing of the display.
- the state of the art is sufficient to enable a wired connection between a computing device and a display with a receptacle in the display wherein the display thickness is smaller than 5 mm down to a display thickness of 3 mm and wherein the receptacle is part of the housing and the actual connector is about 1 mm thick.
- a battery is an ultra-light and ultra-thin battery made of nanocomposite paper as disclosed in the article “Flexible energy storage devices based on nanocomposite paper” by Pushparaj et al. in Proceedings of the National Academy of Sciences (PNAS) Aug. 21, 2007 Vol. 104 no. 34 pages 13574-13577 downloaded from its website www dot pnas dot org and US Patent Publication Ser. No. 20100050779 to Pushparaj et al. published Mar. 4, 2010 which are both incorporated herein by reference.
- a very thin battery not thicker than 1 mm is disclosed in U.S. Pat. No. 6,696,198 to Okahisa et al. issued on Feb. 24, 2004 which is incorporated herein by reference.
- the computing device 2200 and the display are combined as one apparatus.
- the computing device 2200 and the display 2206 are connected and combined, for instance through a separable hinge with male part 2203 with an axis and spring loaded ball connector 2209 and a female hinge part 2204 on the display 2206 that is enabled to receive and lock the spring loaded ball connector axis 2209 .
- a user may separate the display 2206 from the device 2200 .
- in one embodiment of the present invention the separated display can collaborate with the computing device at a distance of at least 1 meter from the computing device.
- the display can collaborate with the computing device at a distance not exceeding 3 meters.
- the display can collaborate with the computing device at a distance not exceeding 10 meters.
- the display can collaborate with the computing device at a distance exceeding 10 meters.
- the computing device has at least one lens/sensor unit 2201 which allows the device to record an image which may be a video image.
- the computing device 2200 is a camera phone or a wireless phone with a camera.
- the computing device 2200 is a camera.
- the detachable display has several advantages.
- One advantage is that one can hold up the camera in a position for instance above one's head where it would be almost impossible to conveniently view its display.
- By detaching the display one can hold up the camera above one's head and still view the scene to be recorded on the detached display.
- the display 2206 has a local control 2210 that activates the camera part of the computing device 2200 . This also allows a person to make an image of himself/herself.
- a computing device 2300 as illustrated in FIG. 23 has at least two lens/sensor units 2301 and 2302 , which may be applied to generate a panoramic image or a stereoscopic image, which can be a video image.
- the display 2206 in FIG. 22 is illustrated from the view seen when it is folded open while attached. When folded closed a screen 2211 is visible.
- when taking an image without detaching the display, but with the display folded open, the screen is not visible to the user in one embodiment.
- a hinge part 2112 may be provided to the display that allows the display to be hinged on the computing device so it faces the user.
- the detachable display has a screen at both of its sides, as is illustrated in FIG. 24 with a screen 2401 .
- a computing device has an integrated screen that cannot be removed and a detachable display that has at least one screen.
- the detachable screen in a further embodiment has two screens, one at each large flat side.
- a computing device such as a PDA, a cell phone, a smart phone, a camera and the like has its own display screen.
- a display is generally an integrated display that is not detachable from the computing device.
- cell phones and smart phones and the like are starting to become the main mobile device, including the main camera, for consumers.
- One or more aspects of the present invention allow the mobile phone to become the primary computing device for a consumer.
- a small portable computing device which may include at least one lens/sensor unit with a very portable thin and light weight detached display with a large display screen.
- Such a display in accordance with an aspect of the present invention is separate, portable and self contained.
- the computing device is able to recognize the display, or vice versa; the display is able to at least receive and use signals from the computing device.
- signals to the display from the computing device are provided via a wired connection.
- signals from the computing device are received by the display via wireless communications, for instance via radio signals such as via a Bluetooth or a wireless USB protocol, or via any other wireless communication that allows the display screen to show images generated by the computing device in real-time and not merely as files.
- FIG. 25 illustrates such a display and the system it may be part of.
- the display 2501 has a housing 2502 and at least one screen 2504 .
- the display with all components and housing is preferably ultra-thin and light and does not consume much energy, preferably a minimum amount of energy that can be provided by an inductive source received by the display to power the screen, or by an ultrathin battery that is not thicker than 1 mm and preferably thinner than 1 mm.
- the inductive power receiver or ultrathin battery 2506 (and which may not have been drawn to scale) is integrated in the housing 2502 .
- the display is provided with a communication device 2507 that communicates with a corresponding communication device 2508 in a portable computing device 2503 that includes at least one lens/sensor unit 2509 .
- the computing device 2503 is a mobile phone or smart phone.
- computing device 2503 is a stand-alone camera, preferably a digital camera.
- a portable and mobile computing device such as 2503 and a portable and mobile display such as an ut/ul display 2501, even though separate, form a system.
- the computing device generates at least one signal that is strictly designated to be processed by the display and displayed by a screen of the display.
- the display is then uniquely configured and/or authorized to correctly process such a signal to create an appropriate image.
- a configuration or authorization can be achieved by encoding or encrypting the signal in the computing device and providing only an authorized display with the corresponding decryption or descrambling circuitry or program.
- the signal is modulated in a unique way, for instance by frequency or time-slot hopping, which can only be demodulated in an authorized way by the display or its embedded receiver.
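The authorization scheme described above can be sketched in Python: both the computing device and an authorized display derive the same pseudo-random keystream from a shared seed, so only a display holding the seed can descramble the image signal. The seed value and function names are illustrative assumptions, not part of the specification, and a real system would use proper cryptographic primitives rather than this toy XOR scrambler.

```python
import random

def keystream(seed, n):
    """Derive n pseudo-random key bytes from a shared seed.

    An authorized display is provisioned with the same seed, so it
    can regenerate the identical keystream for descrambling.
    """
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(n))

def scramble(data, seed):
    """XOR the image data with the keystream (self-inverse)."""
    ks = keystream(seed, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

# The computing device scrambles; only an authorized display,
# holding the same seed, can descramble.
image_data = b"frame-0001-pixels"
sent = scramble(image_data, seed=0xC0FFEE)
received = scramble(sent, seed=0xC0FFEE)  # authorized display
assert received == image_data
```

The same idea underlies the modulation variant: a seed-derived pseudo-random sequence can equally select frequencies or time slots for hopping, which only a receiver with the seed can follow.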
- a unique and authorized connection can also be achieved by using a single and exclusive wired connection between the computing device and the display wherein the computing device is only connected with the display.
- the above embodiment clearly excludes any broadcasting application for connection to a plurality of displays.
- a computing device connects in real-time with a plurality of displays, which may be ut/ul displays, to provide a real-time image generated by the computing device.
- a real-time image is a panoramic image or a stereoscopic image generated in accordance with one or more aspects of the present invention.
- a system contains at least one camera, such as a device 2503 and a plurality of displays 2501 which are configured and authorized to process a signal generated by the camera to display the image.
- each display contains a processor that is enabled to be programmed to be authorized to process the image signal generated by the camera.
- a coding processor generates a code or a program that is applied to a processor in a display to become authorized to process the image signal generated by the camera.
- the image signal generated by the camera is distributed to a display by a network, which may be the Internet or a mobile telephone network.
- a stand-alone camera in this case is defined as a portable device that is strictly limited to taking or recording images, which may be still images in one embodiment and video images in another embodiment of the present invention.
- a digital camera is a camera with at least one lens/image sensor combination.
- An example of a stand-alone digital camera is a DSLR (digital single-lens reflex) camera or any other digital camera for consumer or professional use.
- a stand-alone camera is not part of any other computing device such as a smart-phone or a mobile phone.
- a stand-alone camera has real-time communications means such as a wireless WiFi connection or Bluetooth or real-time wired or wireless USB.
- This communication means in the case of the stand-alone camera is mainly used to receive instructions and/or image data and to transmit status data and/or image data.
- the communication means is exclusively used to receive instructions and/or image data and to transmit status data and/or image data.
- a stand-alone camera is used to also record sound related to a scene.
- the communication means is also used to transmit and receive instructions and data related to an audio signal.
- a stand-alone camera in a further embodiment of the present invention has processing capabilities, for instance in image processing or other tasks to generate image data that represents a desirable image that may be provided with an audio signal. It is believed that a stand-alone camera that is connectable to a display as disclosed herein is novel.
- a stand-alone camera is connected to an ultra-thin, ultra-light display via a wired connection.
- the wired connection also provides power to the ultra-thin, ultra-light display.
- a system containing a stand-alone camera with a processor and memory and a connection to an ut/ul display to display in real-time an image being recorded by a lens/sensor unit in the stand-alone camera.
- the stand alone camera contains two lens/sensor units.
- the stand-alone camera and the ut/ul display as provided herein are in one embodiment connected via a wire and in a further embodiment they are connected via a wireless connection.
- a camera may not cover a desired field of view.
- a computing device stand 3900 as illustrated in FIG. 37 is provided with a platform 3901 enabled to hold the computing device 3902 containing at least one lens/sensor unit, the computing device having at least a wired or wireless communication device to communicate with a display 4000 illustrated in FIG. 38 with a screen 4001 which may be an ut/ul display.
- the computing device 3902 has an antenna 3903 and the display has an antenna 4003 .
- the stand 3900 further has a rotatable or movable platform 3905 .
- the platform 3905 has at least one remotely controlled actuator such as an electrical motor, which may be a stepping motor or any other remotely controllable motor or actuator, which enables the platform 3901 to be rotated in a remotely controlled fashion.
- the actuator is either bi-directional, based on a control signal allowing it to move in a direction and a reverse direction, or consists of at least two actuators, one working in a direction and at least one other working in the reverse direction.
- the platform itself has a receiver to receive a control signal for an actuator in the platform, for instance from the computing device or from the display.
- a rotating control signal is provided via computing device or camera 3902 to a motor or actuator controller in the platform.
- both the platform and the computing device as well as the display are Bluetooth enabled.
- the platform, for instance at device holder 4102 , has a connector 4108 to make a connection with device 4101 as illustrated in FIG. 39 , wherein an actuator controller in the platform receives a control signal via 4101 .
- via a remote controller, a device holder attached to a platform, which in one embodiment contains a computing device with a lens/sensor unit, is instructed from a computer display to follow an object.
- a computing device on a platform moves substantially in one plane.
- a computing device on a remotely controlled platform is also enabled to move in a vertical plane, for instance by using a vertical actuator such as a telescoping actuator 3906 , or any other actuator that can change the orientation of the platform.
- a specific connector such as 4108 in one embodiment of the present invention requires that a device and a device holder on the platform have matching or corresponding features, and allows creation of a platform, or at least a device holder of the platform, that is proprietary to a certain device.
- a remote controller 4002 which may be a joystick, is incorporated in the display.
- one or more sensors such as accelerometers 4004 are attached to the display or are incorporated in the display.
- Such a display with accelerometers is disclosed in U.S. Pat. No. 7,688,306 to Wehrenberg et al. issued on Mar. 30, 2010 which is incorporated herein by reference.
- the purpose of the sensors is to determine a position or a change of position of the display. Based on the detected change in position a signal which is associated with the difference of position of the display is generated and is used to control or drive a motor or an actuator in the platform.
- Positional sensors and positional difference sensors are known and include accelerometers, magnetic sensors, MEMS with mechanical components, gyroscopes such as optical fiber gyroscopes, Hall sensors, inertial sensors, vibrational sensors and any other sensor that can be applied to determine a positional or angular difference or angular rate change or angular velocity or a linear rate change or linear velocity change of the display. These sensors will be called a positional sensor herein. Many of these positional sensors are available in very small formats, for instance integrated on one or more chips, which can easily be fitted inside a display, even in an ut/ul display. A change in position or angle of the display causes the actuator in the platform to be activated.
- the actuator in the platform is activated based on the amount of change in the position of the display from a neutral position. To stop the actuator from being activated the display has to be returned to its neutral position. A larger change in position of the display from its neutral position can create a faster activity by the actuator. In a further embodiment of the present invention, activity of the actuator depends on the rate of change of the position of the display. Also, because the sensors are directionally sensitive, in a further embodiment of the present invention a control element such as a button is provided on the display, which allows the actuator to respond or stop responding to a change of position of the display. This allows a user to return the display to a neutral position without activating the actuator.
- the display has a control element, such as a button, to reverse the moving direction in a substantially horizontal plane based on a movement of the display.
- When a display is moved from left to right the platform also moves from left to right. This is useful when a user of the display and the camera in the device holder face in the same direction.
- Activating the control again reverses the direction of the actuator as a result of a movement of the display.
- a user confirms in a certain position a zero position which corresponds with a zero position of a computing device.
- the user can cause the movement of the platform and thus of the computing device with a camera and track an object or pan an area.
- a fixed tracking plane for the platform can be set or programmed.
- the platform can be provided with a third movement, for instance in a vertical plane.
- the display can be one of a ut/ul display, a portable computer display, a tablet PC, a mobile phone, a smart phone, a PDA (personal digital assistant) or any other device including a display screen, at least one accelerometer, and a communication device to connect to a remote camera and a remote motor controller which is preferably a wireless connection but can also be a wired connection.
- a display has a processor that is programmed to instruct a motor controller to follow a pre-programmed track.
- a tracking path of the platform is recorded and can be repeated by the display or the computing system.
- the remotely controlled platform 4100 for a computing device for instance including a small camera can also be small as is illustrated in FIG. 39 .
- a movable platform has a bracket or fixture 4105 that can be fixed, for instance temporarily, against or on an object or a wall.
- Fixed on the bracket or structure is a motor or actuator holder 4104 which contains the motor or actuator controller and the motor or actuator that can move an axis or arm 4103 in a first plane which can be a horizontal plane.
- the platform has a receiver to receive actuator control instructions from the display.
- a camera/computing device holder 4102 attached to the arm is enabled to receive and tightly hold a computing device 4101 .
- a motor or actuator 4106 is attached between 4102 and 4103 , enabling the holder 4102 to be rotated in a different plane for instance perpendicular to the first plane.
- yet another actuator is included in the platform that allows the holder 4102 to be rotated in a plane perpendicular to the horizontal plane in which arm 4103 can move and perpendicular to arm 4103 . This allows in effect a rotation of the field of view of the camera in computing device 4101 and allows for a remote correction of the horizon in the image taken by the camera of 4101 if the orientation of the platform is not aligned with an orientation of an object.
- FIG. 40 illustrates the platform viewed from above.
- the actuators in one embodiment of the present invention are provided with electrical power.
- a power source such as a battery, in one embodiment of the present invention is included in the platform.
- a battery in the computing device powers the platform, for instance via a power connection.
- power is provided from an external source, for instance via a power cord.
- a platform 4100 is fixed against a wall, a structure or an object with a removable adhesive such as Scotch® WallsaverTM Removable Mounting Square of 3M Corporation.
- Other attachment means such as magnets, Velcro® fasteners, pins, screws, straps, clips or any other fastening mechanism that allows the platform to be fixed to an object or structure are fully contemplated.
- This allows a computing device with a camera to be positioned, preferably in a removable manner on a non-intrusive location, for instance during a performance and allow a user to pan, track, focus and zoom on part of a scene or a performance.
- a stand like a portable camera stand is provided.
- a camera remote control is provided, which may be incorporated in the display.
- a touch type screen in the display in one embodiment of the present invention is enabled to display a camera control to control one of camera focus, camera zoom in/out, camera aperture, camera on/off and any other camera control.
- a confirmation signal from the display keeps the camera platform and/or the camera in its preferred settings.
- a screen 4300 contains an image of a person 4302 recorded by the camera in the computing device on the platform.
- a user provides an identifier 4301 , such as an ellipse, around the image on a screen of the display and a processor in the computing device or the display is trained to learn characteristics of the object. This may be a segmentation learning, a statistical distribution learning, specific characteristics learning such as head and legs, or detection of a provided characteristic such as a colored tape or patch on the object or any other object image learning that allows an object to be detected in an image.
- the processor is enabled to recognize or track the object or person in a scene.
- the processor in one embodiment determines a tracking center 4304 of the object.
- a user may provide a tracking box 4303 on the screen which determines a part of the screen wherein the object's or person's image has to appear on the screen. For instance one may want to have the object appear at the right side of the screen.
- the size of the box 4303 determines the activity of the platform. If a person moves inside an area corresponding to 4303 the platform will not move, creating a relatively stable screen image. However, when the person or object moves outside the area corresponding to the box the platform will move.
- the location of for instance the tracking center 4304 determines the control signals provided via the processor to the motor controls of the platform.
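The tracking-box behavior described above can be sketched as follows: the platform is driven only when the tracking center leaves the box, and the control signal is proportional to how far the center lies outside it. The pixel coordinate convention and the gain are assumptions for illustration, not part of the specification.

```python
def pan_tilt_command(center, box, gain=0.1):
    """Compute (pan, tilt) motor commands from a tracking center.

    center: (x, y) of the tracked object's tracking center in pixels.
    box: (left, top, right, bottom) area of the screen in which the
         object's image is allowed to appear.
    While the center stays inside the box no command is issued, which
    keeps the screen image relatively stable; once the center leaves
    the box, the command is proportional to the overshoot.
    """
    x, y = center
    left, top, right, bottom = box
    pan = tilt = 0.0
    if x < left:
        pan = gain * (x - left)
    elif x > right:
        pan = gain * (x - right)
    if y < top:
        tilt = gain * (y - top)
    elif y > bottom:
        tilt = gain * (y - bottom)
    return pan, tilt

# Object inside the box: platform stays still.
assert pan_tilt_command((320, 240), (200, 150, 440, 330)) == (0.0, 0.0)
# Object drifts right of the box: pan right, no tilt.
pan, tilt = pan_tilt_command((500, 240), (200, 150, 440, 330))
assert pan > 0 and tilt == 0.0
```

A larger box tolerates more object motion before the platform moves, matching the description that the size of the box determines the activity of the platform.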
- Image tracking is well known and is disclosed for instance in U.S.
- a processor in one embodiment of the present invention is provided that is programmed to track a moving object from an image generated by a lens/sensor unit on the mobile platform. Based on the location of a tracking point in the image of the object a processor controls a motor or actuator in the mobile platform that allows a lens/sensor unit in a device held on the platform to keep the object in its field of vision.
- the mobile platform is compact and light.
- a holder of the mobile platform is about the size of a mobile phone, but made of light strips or a web of rigid or semi-rigid material.
- the weight is preferably less than 500 grams, more preferably less than 250 grams and most preferably less than 200 grams.
- the mobile platform should be mobile and portable. Its size should allow it to fit in a box with a volume preferably less than 1000 cm³, more preferably about or less than 500 cm³ and most preferably about or less than 250 cm³.
- the computing device 2503 has no lens/sensor unit and is a router or a router like device.
- the display 2501 has at least one control 2505 to control the display screen, including but not limited to an on/off switch, a contrast control, a brightness control and the like.
- the control 2505 is a touch screen that is either separate or part of screen 2504 .
- the display 2501 also contains control logic and memory 2512 .
- the device 2512 may also contain a processor for processing image data that is to be displayed.
- device 2507 receives a signal that contains image data.
- Device 2507 , which may be a wireless device such as a Bluetooth device or a WiFi device, or a wired connection such as a USB port, can demodulate the received signal, for instance from 2508 , into for instance a base-band signal representing a displayable image that is provided to 2512 , where it may be stored, if necessary only temporarily, to be processed and displayed on the screen 2504 .
- a demodulator demodulates the signal and an extractor extracts from the demodulated signal the image data.
- the image data is arranged in a frame structure that allows the image data to be displayed on the display screen.
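As a sketch of the frame structure mentioned above, one can assume a minimal layout in which each frame carries a small header with the image width and height followed by row-major pixel bytes. The header layout and function names are illustrative assumptions, not a format defined in the specification.

```python
import struct

def pack_frame(width, height, pixels):
    """Pack pixel bytes behind a header of two big-endian 32-bit fields."""
    pixels = bytes(pixels)
    assert len(pixels) == width * height
    return struct.pack(">II", width, height) + pixels

def extract_frame(frame):
    """Extract (width, height, rows) from a received frame so the
    image data can be handed to the screen driver one line at a time."""
    width, height = struct.unpack(">II", frame[:8])
    body = frame[8:8 + width * height]
    rows = [body[r * width:(r + 1) * width] for r in range(height)]
    return width, height, rows

frame = pack_frame(4, 2, range(8))
w, h, rows = extract_frame(frame)
assert (w, h) == (4, 2)
assert rows == [bytes([0, 1, 2, 3]), bytes([4, 5, 6, 7])]
```

In the display, such an extractor would sit between the demodulator and the frame store in 2512, recovering displayable lines from the received byte stream.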
- 2512 may contain a mass storage chip to store a series of images, such as consecutive video images.
- display 2501 also contains a power inlet 2511 that allows it to be connected to a power outlet, for instance to a power outlet 2514 on device 2503 to power the display 2501 .
- the power connection between 2511 and 2514 includes a communication connection and may be a USB connection or a WiFi connection. This allows the display to be powered by 2503 while also taking care of the communication that otherwise would be taken care of by 2507 and 2508 .
- the computing device 2503 is a very small device that fits easily in a clenched fist of a mature person.
- a device may have dimensions of about 8*3*2 cm (about 48 cm³ in volume), such as for instance the Samsung® SCH-U470 model mobile phone, or smaller.
- the device has a screen for display of about 3*2 cm, or larger or smaller which is adequate for viewing information, but of course provides only a limited quality view of images. The advantage of such a small device or even smaller, is that it is extremely portable.
- a display 2501 that shows images taken by 2509 of 2503 in real time allows the display 2501 to be used as a real-time viewer.
- At least one lens/sensor unit 2509 on a device 2503 records an image, which may be a video image, and provides the related image data to a communication device 2508 which transmits it to a receiving device 2507 of display 2501 to be processed by 2512 to be displayed in real-time or close to real-time on screen 2504 .
- device 2503 has at least 2 lens/sensor units from which device 2503 creates a 3D or a panoramic image which is provided for transmission on 2508 .
- the screen 2504 may be a 3D enabled screen to display 3D images.
- a computing device is connected to a single display wherein a screen 2504 of the connectable ultrathin, ultralight display is at least 1.5 times the size in area and preferably at least 2 times the size in area of a display screen that is an integrated part of the computing device.
- the connectable ultralight ultrathin (ut/ul) display is preferably at least 5 times, more preferably at least 10 times, yet more preferably at least 20 times, yet more preferably at least 40 times and most preferably at least 100 times the size of the integrated display of the computing device.
- the screen 2504 works as a real-time viewer of lens/sensor unit 2509 .
- a camera unit creates and stores an image.
- Such an image can be transferred to another device in the form of a file, including a RAW file.
- the screen 2504 shows the same image, which may be a video image as one would see on a viewer on the device.
- a camera viewer on the computing device in one embodiment is significantly smaller than the screen 2504 . This may mean that screen 2504 can show an image in a higher resolution than the integrated viewer on the computing device. This issue can be addressed in several ways.
- the computing device recognizes the device 2501 and the display screen requirements.
- This may include the type of display screen and the signals formats required to display an image correctly on the screen 2504 .
- a screen 2504 can show four times as many pixel lines as the integrated viewer.
- This recognition may be achieved for instance by an identifying code that is transmitted by device 2501 during initiation of a connection and that is recognized by 2503 .
- Device 2503 may have a memory that stores different display protocols for a possible connected screen 2504 .
- an image processing application is activated that processes image data generated by 2509 into the format required by 2504 .
- that processed data is transmitted as a signal to the display 2501 for display on 2504 .
- the signal is provided to a driver of the screen 2504 to generate the image on the screen.
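The recognition and formatting steps above can be sketched as a lookup: the display transmits an identifying code during connection initiation, the computing device looks up the stored display protocol in its memory, and an image-processing step rescales the image to the required format. The protocol table entries, identifiers, and the nearest-neighbor rescaling are hypothetical examples, not details from the specification.

```python
# Hypothetical table of stored display protocols, keyed by the
# identifying code a display sends during connection initiation.
DISPLAY_PROTOCOLS = {
    "UTUL-200": {"lines": 480, "columns": 640},
    "UTUL-400": {"lines": 1920, "columns": 2560},  # 4x the pixel lines
}

def negotiate(display_id):
    """Return the stored display protocol for a recognized display."""
    if display_id not in DISPLAY_PROTOCOLS:
        raise ValueError("unrecognized display: " + display_id)
    return DISPLAY_PROTOCOLS[display_id]

def format_for_display(image_lines, protocol):
    """Rescale the line count to the format required by the screen,
    here by simple line repetition (nearest-neighbor)."""
    target = protocol["lines"]
    step = len(image_lines) / target
    return [image_lines[int(i * step)] for i in range(target)]

proto = negotiate("UTUL-400")
viewer_lines = ["line%d" % i for i in range(480)]
out = format_for_display(viewer_lines, proto)
assert len(out) == 1920 and out[0] == "line0"
```

Keeping this conversion on the computing device matches the stated preference for doing power-intensive processing on the device rather than on the ut/ul display.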
- the signal is a modulated signal, such as an RF (radio frequency) or infra-red signal.
- a wired connection may also require a modulated signal.
- the display 2501 has a receiver to receive the signal, and to condition the signal for demodulation, demodulates the signal, extracts the image data and processes it to be provided in the required format for the driver and provides the image data to the driver to drive the screen to show the image. Intermediate steps may be required to enhance a signal and to remove extraneous data that were used for correct communications, including but not limited to handshake data and data error correction data.
- the device 2503 transmits image data to 2501 in a standard format such as JPEG or RAW.
- the device 2501 has a processor that is programmed to transform the image data from its standard format to the format required by a screen 2504 and its related driver. While this is a possible embodiment of the present invention, and a small and low-power processor on the device 2501 can achieve this, it is a less preferred embodiment. It is preferred to have most power-intensive processing done by the device 2503 and to limit power-intensive processing on device 2501 .
- the device 2503 stores at least one image or a video image.
- the device 2503 is enabled to play for review stored images.
- the device 2503 provides stored images for review on 2508 to be transmitted to 2507 to be reviewed on 2504 .
- the display is a self-contained and portable display that works with a computing device that includes a camera with at least one sensor/lens unit. Users already carry many devices, which may include a cellphone, a laptop, an MP3 player, a digital camera and the like. Carrying an additional device, such as a display 2501 , requires the display to provide desirable advantages while avoiding substantial disadvantages. It is believed that an embodiment of a connectable display with a sturdy and rigid housing, a very thin profile, a large screen and a very low weight offers a substantial advantage that is currently not available and has also not been articulated as such.
- an embodiment of the present invention that provides a separate camera with a smaller display screen and a separate connectable display with a large display screen offers a substantial advantage. Furthermore, the ability to view large format images is believed to be a very substantial advantage, for people with normal vision and for people with impaired vision alike. Furthermore, the ability to view large format images generated in real-time by a camera on a separate display screen is believed to be a substantial advantage. Furthermore an embodiment of the present invention enabled to show images in full color is believed to be an advantage. Furthermore, an embodiment of the present invention to show video or video-like images in real time on the display screen is believed to be an advantage.
- Seiko Epson reported in 2007 an eReader in an 18 cm by 12 cm housing, 3 mm thick and weighing 57 grams, at Embedded Technology 2007, an exhibition event in the Pacifico Yokohama convention center.
- the device has a USB port.
- the device displays grey-scale, not video-enabled, images and reportedly has a refresh time of 0.8 sec. It was also not disclosed how data exchange was performed. It appears that it supports document-type transfer.
- a display screen that meets the requirements as an aspect of the current invention would be like a magazine or a paper notepad for instance of A4 format or larger.
- Such a notepad is extremely portable and light weight, noticeable only by its size, not by its weight or bulk, and is very transportable without being intrusive.
- the computing device connects directly with the display. This means that the computing device generates and transmits a signal that is received and processed by the display. No intermediate signal processing between the two apparatuses is required in case of wireless transmission.
- in a wired transmission, only a single wire between the computing device and the display is used for a direct connection.
- a direct connection is autonomous and no other devices than the computing device and the display in a wireless connection are required.
- in a wired connection, only the computing device, the display and the wire are required.
- the wire may contain circuitry, such as amplification circuitry. However, the wire is also portable and mobile, so that the computing device is enabled to be connected to the display with the wire without any further supporting infrastructure.
- the upper limit of weight for an external display to be carried along is about 1 pound, with a screen size of at least 10 by 15 cm, preferably 15 by 23 cm and more preferably 8 by 11 inches and yet more preferably greater than 8 by 11 inches.
- the upper weight limit for all the preferred screen sizes is about 1 pound, more preferably 12 ounces and even more preferably 8 ounces or less. Even more preferable is a weight of 6 ounces.
- a weight of 16 ounces or less is preferable, a weight of 12 ounces or less is more preferable and a weight of 6-8 ounces is most preferable.
- a weight substantially less than 6-8 ounces and certainly less than 6 ounces is also preferable.
- a low weight is preferable for a portable display but a very light weight display weighing less than 6 ounces appears to lose a sense of substance.
- An even smaller size, such as a half-postcard-sized display screen of about 7 cm by 4.5 cm, is desirable and even more preferable if it is thinner than 5 mm and light, for instance 12 ounces or lighter, or even more preferably 8 ounces or lighter.
- a larger size with low weight is also preferable.
- a minimum screen size of substantially or about 20 cm by 15 cm for a portable display is more preferable, while a screen size of a notebook or writing pad (standard 8.5 by 11 inches, or A4 in Europe) appears to be very preferable. Larger sizes at low weight are also preferred, but less so for a portable display as it starts to become too large to be carried in carry-on luggage.
- a large screen as an aspect of the present invention is fully color enabled. In a further embodiment of the present invention it is also video enabled with a display rate of at least 12 fps.
- Sizes as provided herein are sizes equivalent to an area that is defined by an area of a rectangle with one side L and one side W. The area is then L*W. So if a size is preferably larger than 15 cm by 10 cm then it means that a size is preferably larger than 150 cm 2 .
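The area equivalence defined above can be expressed directly; this is a trivial helper purely for illustration, with an assumed function name.

```python
def area_cm2(length_cm, width_cm):
    """Screen size expressed as the area L*W of the bounding rectangle."""
    return length_cm * width_cm

# A size "preferably larger than 15 cm by 10 cm" thus means an area
# preferably larger than 150 cm².
assert area_cm2(15, 10) == 150
```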
- a display has a thickness of 1 cm or less. More preferable is a display that has a thickness of 7.5 mm or less. Even more preferable is a display that has a thickness of 7 mm or less. Even more preferable is a display that has a thickness of 6 mm or less. Even more preferable is a display that has a thickness of 5 mm or less. Even more preferable is a display that has a thickness that is 4 mm or less. Even more preferable is a display that has a thickness of 3 mm or less.
- a thickness of 2 mm or less is also preferred. However, there is a requirement also of rigidity and sturdiness. A thickness of 2 mm or less is only preferred if the housing and the whole display maintains a feel of sturdiness and stiffness and rigidity. There is also a matter of cost to create the display to be considered.
- a display is a separate but connectable ultra thin/ultra-light display (UT/UL or ut/ul display).
- a ut/ul display is a portable external accessory to a computing device, like earphones.
- Current devices and displays are not ut/ul displays.
- First of all the displays are not separate from the computing devices.
- the Apple iPAD® for instance has a size of 242 by 189 by 13.4 mm with a weight of about 700 grams and clearly does not meet the requirements of a ut/ul display.
- the Apple iPOD® for instance has a size of about 110 by 62 by 8.5 mm with a weight of about 115 grams, wherein the actual screen is substantially smaller than the size of the device, and also does not meet at least the thickness requirements of an ut/ul display.
- a thickness of at most 4.5 mm in a box-shaped display will still meet the requirements of a ut/ul display.
- Part of the display is formed by the screen and screen backplane, the housing of the display, and the battery or power source (if used).
- the screen itself can be thin if an OLED or an electrophoretic display screen is used.
- the thickness of ePaper or eInk display screens with backplane has been reported to be in a range of 250 micron to 850 micron. Accordingly, display screens of large areas complying with the requirements herein are available with a thickness of about 500 micron (or micrometer).
- Full color and video imaging is enabled by an electrophoretic screen using electrowetting. This is disclosed in for instance US Patent Publication Ser. No. 20100128015 to Feenstra et al. published on May 27, 2010, which is incorporated herein by reference in its entirety. Such screens have been demonstrated by Liquavista of Eindhoven in the Netherlands.
- the body of the display should be strong but also ultra thin.
- One way to achieve this is to use high-strength and high stiffness polymers.
- a polycarbonate material such as disclosed in U.S. Pat. No. 6,043,310 to Liu et al. issued on Mar. 28, 2000 which is incorporated herein by reference.
- One may use a housing with a maximum thickness of 5 mm or less.
- one may use a housing with a maximum thickness of about 3 mm or less.
- one may use a housing of a maximum thickness of about 2 mm or less.
- one may use high strength polymer composite materials that have been reinforced by fiber materials. These composite materials are well known.
- carbon fiber reinforced composites are used.
- a carbon reinforced polyimide resin is used to form a thin but strong housing.
- the housing is made of carbon nanotubes reinforced composites. Accordingly, a stiff housing for a display can be made with a thickness of less than 2 mm thick; of less than 3 mm thick; of less than 4 mm thick; and of less than 5 mm thick.
- the supporting electronic devices for communication, power management and screen display management are embedded in electronic integrated circuits. These integrated circuits are embedded in the plastic housing of the ut/ul display and provided with the required connections. How to embed integrated circuits into the plastic housing is known and is for instance disclosed in the article “Embedding of thinned chips in plastic substrates” by Vandecasteele et al. of IMEC-TFCG Microsystems in Zwijnaarde, Belgium as downloaded from the Internet and which is incorporated herein by reference.
- the components such as chips and connectors can be fixed in or to the shell or the foam in different ways.
- foam or shell are partially ablated for instance by a laser such as a pulsed laser to create a shape in the shell or the foam that will receive the relevant components, which can then be placed and fixed into place, for instance by a glue bonding.
- FIG. 26 illustrates one embodiment of an ut/ul display 2600 both in front view 2600 a and a cross sectional side view 2600 b .
- the display has a body or housing 2601 with a thin screen 2602 as was discussed above.
- an at least one control element 2606 which may be an on/off switch.
- the ut/ul display has a connector 2603 for connecting an external power source.
- at least one embedded IC or chip 2607 is shown in the cross sectional view.
- Chip 2607 in a further embodiment is representative of at least a power control chip, a display control chip (including a driver), and any other ICs that are required to enable the display.
- a device 2605 which is included in the housing in an embodiment represents a communication device, including an antenna in a further embodiment, to receive and process signals from a computing device that transmits signals representing an image. These signals are processed into further signals that can be processed by 2607 to be provided to 2602 for display of an image, which can be a video image.
- image signals may be provided through a wired connection.
- Device 2605 in such an embodiment can be configured to connect to a wire to receive the required signal.
- a wire may be an electrical wire or an optical wire such as an optical fiber.
- 2603 is for instance a USB port enabled to transmit power as well as signals such as an image signal.
- a battery causes much of the bulk and weight of a computing device.
- screen 2602 is a low power consumption screen as is known in the art.
- an ultrathin battery 2604 which has a thickness of 1 mm or less and preferably of 0.5 mm is applied.
- Very thin Li batteries using Lithium Phosphorus Oxynitride (LiPON) as electrolyte in a FET configuration are known.
- Ultra-thin batteries are disclosed in U.S. Pat. No. 7,389,580 to Jenson et al. issued on Jun. 28, 2008 which is incorporated herein by reference.
- power is provided by inductive transmission as discussed above and 2604 represents such an inductive power receiver.
- the housing serves to hold the screen and all the supporting electronics and provide stiffness to the display. It also serves to provide a user with the ability to hold the display at the edges.
- the edges of the housing are tapered as shown in FIG. 27 by 2701 . This provides a user with a sense of thinness and saves in weight.
- a portable display has a screen that is larger than a screen of the computing device it is connected to.
- the display has a thickness that is not greater than 5 mm. In a further embodiment it has a thickness that is not greater than 7 mm. In yet a further embodiment it has a thickness that is not greater than 1 cm. In yet a further embodiment it has a weight that is less than 12 ounces. Because the thickness and weight are not as restricted as in the above disclosed embodiments, one may use cheaper technology including batteries and connections and display screens and the like. It is currently possible to create very small and compact computing devices with camera capabilities, and with integrated storage, processing, rechargeable power and communication capabilities. Such devices can be fitted with a small display screen to provide minimal status information and with limited input controls.
- a portable and connectable display is provided that has a screen size that is at least larger than the integrated display screen on the computing device.
- a screen of such a display device is provided with touch screen capabilities, allowing it to be operated as an input device to a separate computing device.
- the computing device is enabled to work independently from an external display. It has its own input controls and a display screen.
- an external portable display is provided that can be connected to the computing device.
- the display has a screen that is at least partially or in its entirety a touch screen.
- the display is connectable to the computing device wirelessly, for instance by using a Bluetooth, WiFi or other wireless communication connection. In such a case the connectable display should be provided with its own power source, which may be a battery.
- the display is connectable with the computing device by wire, for instance with a USB connection. In that case the display may have its own power source or it may be powered by a power source that is provided to the computing device.
- a connection is a real-time connection.
- the display screen displays an interface, an image or a document as generated or as being processed by the computing device and not as a completed and stored document.
- Potential delays or interruptions in transmission of data between computing device and display can be addressed by applying a buffer at the display in one embodiment of the present invention, at the computing device in another embodiment of the present invention, and in both the computing device and the display in yet another embodiment of the present invention. This allows temporary storage of data that is to be displayed but has not yet been displayed.
- the display may receive data from the computing device to be displayed on the screen, but the data does not establish a complete frame.
- a communication protocol such as a handshake protocol, may inform the device that data is missing. The missing data will remain in a buffer on the computing device until the display confirms receipt of the data.
- Data buffering techniques between a transmitting device and a receiving device are well known in the art.
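The sender-side buffering and handshake described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the class name, method names and byte-string frames are assumptions introduced for the example. The computing device keeps each transmitted frame queued until the display confirms receipt, so any frame that did not establish a complete image on the display can be resent.

```python
from collections import OrderedDict

class FrameBuffer:
    """Sender-side buffer: a frame stays queued until the display
    acknowledges receipt, so missing data can be retransmitted."""

    def __init__(self):
        self.pending = OrderedDict()  # sequence number -> frame bytes
        self.next_seq = 0

    def queue(self, frame):
        # Assign a sequence number and keep the frame until acknowledged.
        seq = self.next_seq
        self.pending[seq] = frame
        self.next_seq += 1
        return seq

    def ack(self, seq):
        # The display confirmed receipt; the frame can be discarded.
        self.pending.pop(seq, None)

    def unacknowledged(self):
        # Frames still awaiting confirmation (candidates for resend).
        return list(self.pending.keys())

buf = FrameBuffer()
s0 = buf.queue(b"frame-0")
s1 = buf.queue(b"frame-1")
buf.ack(s0)
print(buf.unacknowledged())  # [1]
```

A handshake protocol as mentioned above would drive `ack` from the display side; here it is invoked directly to keep the sketch self-contained.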
- the size of the screen of the connectable ut/ul display is at least larger in area than the screen of the computing device.
- the ut/ul screen is larger in area than 7 cm by 5 cm; more preferably the ut/ul screen is larger in area than 10 cm by 7 cm; even more preferably the ut/ul screen is larger in area than 15 cm by 10 cm; even more preferable the ut/ul screen in area is larger than 25 cm by 19 cm; even more preferable the ut/ul screen in area is larger than 30 cm by 20 cm; even more preferable the ut/ul screen in area is larger than 30 cm by 25 cm.
- Such a device is portable and usable as a wireless phone, MP3 player, camera (video and/or panoramic and/or 3D), video viewer, data viewer, audio and video recorder, data storage, data recorder, and in any other form for which a computing device is or can be configured.
- a computing device may be a wireless phone, MP3 player, camera (video and/or panoramic and/or 3D), video viewer, data viewer, audio and video recorder, data storage, data recorder, and any other form for which a computing device is or can be configured.
- viewing of data, documents and images is not as convenient or useful as with a large display screen.
- computer applications such as word processing, spreadsheets, presentations, image processing and the like can be stored on a small footprint device and can be processed by a processor on the device but cannot be viewed and/or displayed adequately.
- the use of certain computer applications is very much enabled when an external ut/ul display is connected.
- One apparent advantage of using a ut/ul display together with a separate computing device with a lens/sensor unit is the ability to view an image during recording without requiring the user to view a scene through the image viewer on the computing device.
- the weight of the computing device is also limited. A large integrated screen in the computing device will add to the weight. So will a large battery.
- the computing device has a weight that does not exceed 250 grams. In a more preferred embodiment of the present invention the computing device has a weight that does not exceed 200 grams. In yet a more preferred embodiment of the present invention the computing device has a weight that does not exceed 150 grams. In yet a more preferred embodiment of the present invention the computing device has a weight that does not exceed 100 grams.
- the weight of the computing device is diminished by applying a thin battery, not exceeding 5 mm in thickness. In one embodiment of the present invention the weight of the computing device is diminished by applying a thin battery, not exceeding 3 mm in thickness.
- a computing device recognizes when an external display is connected to it.
- Such a computing device may for instance use a simple, menu driven interface to a user on its own integrated screen.
- a computing device can be under control of an operating system.
- Such an operating system in one embodiment of the present invention is operational on a processor in a small footprint computing device.
- the operating system is programmed to recognize the connection of an external display device.
- Such recognition capability is for instance known for USB devices.
- When an external ut/ul display is recognized, the operating system through the processor generates a graphical user interface (GUI) for the external display screen, while the integrated display screen on the computing device maintains its own interface.
- the GUI on the external display screen may be controlled by a control on the computing device.
- the external ut/ul display may also have its own user control, which may be a touch screen which communicates through the connection with the processor on the computing device.
- the external ut/ul display device depends for most of its processing capabilities and storage capabilities on the computing device.
- the external display in such an embodiment has capabilities to communicate with the computing device. It may have some processing power to generate appropriate graphics, images and characters. This may include screen drivers, and some control circuitry, power circuitry and some local memory to hold and perhaps recall images or GUI interfaces as well as communication circuitry to communicate with the computing device.
- all actual processing and generation of data, images and the like takes place on the small footprint computing device. This means that only a minimum of circuitry and supporting connectivity, storage and power circuitry is located on the external ut/ul display and limits its thickness and weight.
- one may include additional processing circuitry and memory on the external ut/ul display, allowing it to process some data for display and to cache data for future use.
- One may also include processing capabilities to locally generate and hold a GUI on the external display. This means that for instance an external display may be disconnected from the computing device without losing the data or screen information that is being displayed.
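The recognition behavior described above can be sketched as follows. All names and the dictionary-based device description are illustrative assumptions: the point is only that the operating system keeps the integrated interface while generating a separate GUI when an external display is detected, in the manner of USB device-class recognition.

```python
class DisplayManager:
    """Sketch of an OS component that keeps the computing device's own
    integrated interface and generates a separate GUI when an external
    ut/ul display is recognized."""

    def __init__(self):
        self.integrated_gui = "simple-menu"  # the device's own interface
        self.external_gui = None

    def on_connect(self, device):
        # A connected device announces a class, as USB devices do.
        if device.get("device_class") == "display":
            # Generate a richer GUI sized for the external screen.
            self.external_gui = {"resolution": device["resolution"],
                                 "layout": "full"}

    def on_disconnect(self):
        # Disconnection leaves the integrated interface unaffected.
        self.external_gui = None
```

In this sketch, connecting a device described as `{"device_class": "display", "resolution": (1024, 768)}` populates `external_gui` while `integrated_gui` remains unchanged, mirroring the two-interface behavior described above.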
- the computing device in one embodiment of the present invention has wireless phone and other capabilities such as camera capabilities and/or document processing capabilities.
- the wireless phone capabilities can be independently switched on and/or off without switching on/off the other capabilities.
- the screen of a connectable ut/ul display is a screen with touch screen capabilities. This allows the screen to support full color video while also having touch screen capabilities.
- a light weight display with touch screen capability has been disclosed in US Patent Application 20100171708 to Chuang published on Jul. 8, 2010 which is incorporated herein by reference. It is to be understood that an embodiment of a ut/ul display that uses touch screen technology and/or a control that communicates with a processor on the computing device is enabled to receive and process signals from the computing device and to send signals to the computing device.
- FIG. 28 is a diagram of an embodiment of the present invention. It shows a portable and connectable display 2804 with a screen 2805 .
- the display 2804 has a connecting device 2803 which allows the display 2804 to be connected with a connecting device 2802 of a computing device 2801 which preferably is a small footprint computing device.
- the computing device 2801 in one embodiment has at least one lens/sensor unit 2807 .
- computing device 2801 in one embodiment has at least two lens/sensor units for creating panoramic and/or 3D images as disclosed herein. In one embodiment these panoramic and/or 3D images are created by a processor on 2801 and are transmitted to portable display 2804 to be displayed on screen 2805 .
- the constituent images are generated by the at least two lens/sensor units on 2801 and are provided with control data that allows a processor on 2804 to combine the images to form a panoramic and/or 3D image by applying the control data and to display the panoramic and/or 3D image on screen 2805 .
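The display-side combination step can be illustrated with a toy sketch. Character grids stand in for pixel data and a horizontal offset stands in for the control data accompanying each constituent image; the function name and data layout are assumptions for illustration, not the disclosed method.

```python
def compose_panorama(parts):
    """parts: list of (rows, x_offset) pairs. 'rows' is a list of
    equal-length strings standing in for an image from one lens/sensor
    unit; 'x_offset' is the control data telling the display processor
    where that tile belongs in the combined image."""
    width = max(off + len(rows[0]) for rows, off in parts)
    height = len(parts[0][0])
    canvas = [[" "] * width for _ in range(height)]
    for rows, off in parts:
        for y, row in enumerate(rows):
            for x, px in enumerate(row):
                canvas[y][off + x] = px  # paste tile at its offset
    return ["".join(r) for r in canvas]

# Two side-by-side "images" combined into one panoramic "image".
left = (["AB", "AB"], 0)
right = (["CD", "CD"], 2)
print(compose_panorama([left, right]))  # ['ABCD', 'ABCD']
```

A real implementation would carry richer control data (overlap, alignment, rotation correction), but the division of labor matches the text: the lens/sensor units produce constituent images, and a processor on the display applies the control data to combine them.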
- the portable display 2804 has a control 2806 which may be an on/off switch or one or more controls to control screen functionalities such as brightness, contrast and the like, and which may also contain a mouse or pointer like control to control for instance a cursor or pointer type graphic on the screen 2805 .
- the computing device has at least 2 lens/sensor units, of which one ( 2807 ) is shown facing the viewer of FIG. 28 and at least one faces the opposite side. This allows the device 2801 to record an image in the direction of a user and an image in the viewing direction of the user.
- a lens/sensor unit in the above embodiment is at least a plurality of lens/sensor units that allows creating panoramic and/or 3D images in each direction.
- FIG. 29 shows a diagram of one embodiment wherein the computing device 2801 has a slot 2901 which contains connecting device 2902 which can receive and clamp to an edge of 2804 over the connecting device 2803 to create a single device consisting of 2801 and 2804 .
- the computing device 2801 with slot 2901 is shown in side view in FIG. 29 .
- a lens of lens/sensor unit 2807 is shown in the side view.
- a lens 2903 in an opposing direction is also shown. It is to be understood that in one embodiment of the present invention that either 2807 or 2903 or both 2807 and 2903 are a plurality of lens/sensor units.
- FIG. 29 also shows a diagram of the combined devices 2801 and 2804 .
- This allows 2900 to act as a complete portable computer. However, 2900 can easily be separated again into 2801 and 2804 as separate devices.
- The combined device can act as a camera that applies screen 2805 as a direct viewer of either lens/sensor unit 2807 or 2903 .
- combined device 2900 records and shows images taken both by 2807 and 2903 at the same time on 2805 . This may be done by displaying each respective image in a separate window on 2805 .
- One may also show on 2805 at least one image, which may be a video image, received by the phone device in device 2801 from a network, which may be the Internet.
- One may also receive a plurality of different images at the same time through the phone device.
- at least one window on the screen may be a window related to an application that is being processed by a processor on 2801 .
- This window may also be transmitted through the phone device on 2801 over the network which may be the internet to one or more recipients.
- 2900 can be used as a portable teleconferencing device. It is to be understood that the functionality of 2900 can also be performed by 2804 and 2801 when not combined but in communication through connecting devices 2802 and 2803 .
- 2801 in FIG. 29 is shown being clamped on 2804 on the left edge. It is to be understood that other configurations are possible, including the computing device being clamped on the right edge, on the top edge, on the bottom edge and on the back side of the display 2804 .
- FIG. 30 shows in side view diagram one embodiment of a computing device 3001 that can be clamped onto a connectable display 3004 able to connect with computing device 3001 .
- Computing device 3001 has, as was disclosed above, a slot 2901 which can receive the side of the display 3004 .
- computing device 3001 has a connector 2902 which was also disclosed above and display 3004 has a connector 3005 .
- Connectors 2902 and 3005 in one embodiment are matching male/female connectors that can be connected in a unique way.
- 2902 and 3005 are matching USB connectors. This allows the computing device 3001 to power the display 3004 .
- the power to a display is exclusively provided externally and the display does not have a battery.
- the slot 2901 may be equipped with springs 3003 which further secure the display in the slot.
- Other means to secure the display in a slot, including alignment slots are fully contemplated. Side clamping with a slot has been provided as an illustrated example.
- Other means for connecting and fixating a computing device 3001 to a display 3004 are fully contemplated.
- the display may receive the body of the computing device on the blind side.
- the computing device may have a hook shaped connector that hooks over an edge of the display housing, with a connector inside the hook shape of the computing device and a connector on the edge of the display.
- 3001 and 3004 exchange data via a wireless connection.
- the computing device and the connectable ut/ul display are two different devices that can be connected and/or combined.
- One way to non-exclusively describe a combination of a computing device and a display is by the size of their screens, wherein the screen of the display is larger than that of the device it connects to.
- Another way to non-exclusively describe an aspect of the invention is that no dimension in size of the computing device matches or is equal or substantially equal to a corresponding dimension of the display, with the condition that the screen of the display is larger than the screen of the computing device.
- the computing device housing has a length Lc (its longest dimension), a width Wc and a thickness Tc, wherein all dimensions refer to the housings. These can be referred to as (Lc,Wc,Tc).
- a display in a similar way, has housing dimensions (Ld, Wd, Td).
- none of the corresponding dimensions (Lc,Wc,Tc) and (Ld, Wd, Td) are equal (Lc≠Ld, Wc≠Wd, Tc≠Td).
- none of the dimensions (Lc,Wc,Tc) are equal to any of the dimensions (Ld, Wd, Td).
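The stronger of the two dimensional conditions above (no housing dimension of the computing device equal to any housing dimension of the display), together with the larger-screen condition, can be expressed as a short check. The function name and tuple layout are illustrative assumptions.

```python
def valid_combination(device_dims, display_dims, device_screen, display_screen):
    """device_dims / display_dims: housing (L, W, T) tuples,
    i.e. (Lc, Wc, Tc) and (Ld, Wd, Td) from the text.
    device_screen / display_screen: screen (width, height) tuples.
    True when no housing dimension of the computing device equals any
    housing dimension of the display AND the display screen area
    exceeds the computing device screen area."""
    no_shared_dimension = all(c != d for c in device_dims
                              for d in display_dims)
    larger_screen = (display_screen[0] * display_screen[1]
                     > device_screen[0] * device_screen[1])
    return no_shared_dimension and larger_screen

# A 10 x 6 x 1.5 cm device paired with a 25 x 19 x 0.5 cm display.
print(valid_combination((10, 6, 1.5), (25, 19, 0.5), (5, 3), (24, 18)))
# True
```

The weaker, corresponding-dimension condition (Lc≠Ld, Wc≠Wd, Tc≠Td) would compare only paired positions; the all-pairs comparison above implies it.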
- the display can connect, preferably wirelessly, with any authorized computing device, including communication type devices such as routers, to receive a signal with image data to be displayed.
- an authorization may be established by an authorization code or a way of coding or modulation as is known to one of ordinary skill in the art that allows an apparatus to be identified to be authorized.
- the display is enabled to send a probing signal wirelessly that can be recognized by a computing device.
- the computing device is programmed to recognize a specific signal, associate the specific signal with a specific authorization, and transmit data, including image data, to the display in response to a recognized probing signal.
- the process starts with a computing device sending a signal that can be received and interpreted by the display. This signal tells the display that a computing device is present and ready to receive a probing signal from the display. This allows a display to be traced by a computing device when it is being moved and keep providing the display with appropriate data, which may include image data.
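The announce/probe/authorize exchange described above can be sketched as two device-side steps. The message dictionaries, the authorization-code set and all names are hypothetical stand-ins; any coding or modulation that identifies an authorized apparatus would serve the same role.

```python
AUTHORIZED_PROBES = {"display-17A"}  # hypothetical authorization codes

def device_announce():
    """Step 1: the computing device broadcasts that it is present and
    ready to receive a probing signal from a display."""
    return {"type": "announce", "ready": True}

def device_handle_probe(probe):
    """Step 2: on receiving a probing signal, the computing device
    checks the authorization code; a recognized probe triggers
    transmission of data, including image data, to that display."""
    if probe.get("auth_code") in AUTHORIZED_PROBES:
        return {"type": "data", "image_data": b"frame-bytes"}
    return None  # unrecognized probe: nothing is transmitted
```

Because the device keeps announcing and the display keeps probing, a display that is moved can be re-traced and continue to receive appropriate data, as described above.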
- a plurality of displays are static in location and placed in different positions and a first computing device is mobile, with a moving user for instance.
- a user may for instance be watching a video program that is stored on a second computing device on a first display and that is provided over a network to the first display. Relevant data about the video program is provided and stored on the first computing device.
- the user may move to a different location with a second display.
- the first computing device in communication with the second display communicates the presence of the user and provides the second display with the required data for the second display to show the program that was being watched on the first display with the correct timing and status data. This allows a user to move around a house and follow a video program continuously on different displays at different locations.
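The follow-me viewing behavior above can be sketched as a session object on the mobile computing device that tracks the program and its playback position, handing both to whichever display the user is near. The class and its fields are illustrative assumptions.

```python
class ViewingSession:
    """The mobile computing device tracks the program being watched and
    its timing so a newly nearby display can resume it seamlessly."""

    def __init__(self, program_id):
        self.program_id = program_id
        self.position_s = 0.0   # playback position in seconds
        self.display = None     # display currently showing the program

    def attach(self, display_id):
        # Hand the session to a display, with the data it needs to
        # show the program with the correct timing and status.
        self.display = display_id
        return {"program": self.program_id, "resume_at": self.position_s}

    def tick(self, seconds):
        # Playback advances while the program is being watched.
        self.position_s += seconds

session = ViewingSession("movie-42")
session.attach("living-room")
session.tick(95.0)
state = session.attach("kitchen")
print(state)  # {'program': 'movie-42', 'resume_at': 95.0}
```

The second display receives only resume data; the program itself is still provided over the network from the second computing device, as in the text.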
- the large connectable ultra-thin display is provided with a lens/sensor unit.
- the display has a thickness of 7 mm or less and has a weight of less than 16 ounces, preferably a weight of 12 ounces or less and a size of about 25 cm by 20 cm or larger.
- the lens/sensor unit can be a lens/sensor unit as disclosed by for instance Aptina Imaging of San Jose, Calif. in a wafer level camera form and with a claimed thickness of 2.5 mm.
- Another company offering a similar wafer level camera, called NemoCAM-02AVG, is Nemotek of Morocco. Its website describes the unit as having a thickness of 2.6 mm.
- the datasheet of the NemoCAM-02AVG is incorporated herein by reference.
- a wafer level camera is also taught in US Patent Appl. Publ. Ser. No. 20090309177 to Jeung et al. published on Dec. 17, 2009 which is incorporated herein by reference.
- the wafer level camera can be soldered to a flexible PCB. This may add at least about 0.5 mm to the thickness of the camera.
- the Nemotek camera module can be installed in a display that is 7 mm thick, as an ultrathin housing can be created that has about 5-6 mm of internal space, for instance made from a high strength nanocarbon fiber reinforced shell.
- the shell can be reinforced with a structural foam.
- a ut/ul display having a thickness ‘d’ or less means that its greatest thickness at any location is not greater than ‘d’.
- the thickness is determined to be ‘a’+‘b’. So, with a thickness of 5 mm or less in such an embodiment the camera module has to fit within a housing that has not greater thickness than 5 mm anywhere.
- FIG. 34 shows a module 3401 attached to a Ball Grid Array (BGA) connector 3402 in a unit 3400 .
- This connector still has to be soldered to a PCB, all adding to the thickness.
- all connections, including power are provided through a thin backplane 3403 of about 0.5 mm thick in a first embodiment, but not thicker than 1 mm in a second embodiment.
- the backplane 3403 may have a BGA connector 3406 or any slim connector facing to the top of the camera module.
- the connector 3407 can be about 2 mm thick without interfering with the shell.
- wireless chip-to-chip communication is described in for instance US Patent Application Publ. Ser. No. 20090175323 to Chung published on Jul. 9, 2009 and in the article “RF/Wireless Interconnect for Inter- and Intra-Chip Communications” by Mau-Chung Frank Chang et al., Proceedings of the IEEE, Vol. 89, No. 4, April 2001, pages 456-466, which are both incorporated herein by reference. This requires that the sensor chip of module 3401 is expanded with the communication part, which increases the size of the die.
- the module 3401 can be provided with power through a very thin power rail 3403 .
- the receiver chip is located in 3407 . While the same diagram of FIG. 34 is applied it is noted that specific components in this embodiment have a different function from the earlier provided embodiment in diagram of FIG. 34 .
- FIG. 35 shows in diagram how the camera unit 3405 in one embodiment is fitted against the underside of the surface 3502 of the shell of the housing of the display.
- FIG. 35 shows a diagram both in cross-sectional view and in top view.
- FIG. 31 shows in diagram one embodiment of a display body 3100 with a screen 3105 and a control 3106 and at least one lens/sensor unit 3101 and a connector 3103 to connect the display to the outside world. It is to be understood that related and required circuitry and connections as is known to one of ordinary skill are included in the body.
- at least one lens/sensor unit 3201 in a display body 3200 is fitted into the body at the side of the display at the other side of the screen.
- a control 3107 may be included on that side.
- the display includes at least one lens/sensor unit 3101 at one side and at least one lens/sensor unit 3201 on the other side of the body of the ut/ul display.
- the ut/ul display includes at least two lens/sensor unit 3101 at one side and/or at least two lens/sensor units 3201 on the other side of the body of the ut/ul display.
- both of the at least one lens/sensor units are operated at the same time to create an image which may be a video image. It should be clear that if a body of the display has a thickness of about 5 mm or less then lens/sensor units 3101 and 3201 are positioned not opposite each other in the body, as there is not enough room for that.
- FIG. 33 shows a diagram of an embodiment wherein one side of the display 3300 has at least two lens/sensor units 3301 and 3302 which may be enabled to create a panoramic and/or 3D image.
- the opposite side of the body may also have at least two lens/sensor units. Both at least two lens/sensor units in a further embodiment may be operated at the same time.
- a wafer level camera 3605 is fixed with its lens barrel through the wall 3603 and fit tightly with its backplane against the inside of the wall. Protective material 3603 to protect the barrel may be applied.
- a ut/ul display as provided herein, be it a detached, attached or detachable display, displays images as generated by the computing device in real-time or close to real-time, with a delay that is in general not significantly noticeable by a user.
- real-time video in one embodiment is video displaying at least 12 frames per second.
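Using the 12 frames-per-second threshold above, whether a received stream qualifies as real-time video in this sense can be checked from its frame timestamps. This is a simple average-rate sketch; the function name is an assumption.

```python
def is_real_time_video(frame_timestamps_s):
    """Per the definition above, video is treated as real-time when it
    sustains at least 12 frames per second, i.e. an average inter-frame
    interval of at most 1/12 s (about 83 ms).
    frame_timestamps_s: arrival times of frames, in seconds."""
    if len(frame_timestamps_s) < 2:
        return False  # cannot estimate a rate from fewer than 2 frames
    span = frame_timestamps_s[-1] - frame_timestamps_s[0]
    fps = (len(frame_timestamps_s) - 1) / span
    return fps >= 12.0

print(is_real_time_video([0.0, 0.05, 0.10, 0.15]))  # True  (20 fps)
print(is_real_time_video([0.0, 0.2, 0.4]))          # False (5 fps)
```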
- the ut/ul display must be fairly large to serve its purpose as a large, thin and ultra light screen. It is believed that if a display does not meet certain dimensional criteria it will not serve its purpose as a detached portable large screen. It should have a screen of diagonal dimension of at least 10 cm, more preferably of at least 15 cm, and preferably of at least 20 cm or larger. Preferably it has a thickness of not more than 10 mm, more preferably not more than 7 mm, even more preferably not more than 5 mm, and most preferably 3 mm or even 2 mm or less.
- the weight of the display should be preferably be less than 300 gram, in a further embodiment of the present invention less than 200 gram and in yet a further embodiment less than 100 gram.
- the portable and self contained ut/ul display should be of the size, weight, and thickness of a magazine that one would carry without being bothered by its weight.
- the ut/ul display has a screen that is a touch screen and wherein the display can communicate in a two way manner with the computing device.
- the ut/ul display is connectable to a small footprint device that in a stand alone mode is mainly a mobile phone, but when connected with the display provided with touch screen capabilities is a full blown operational computing device, enabled to perform computing applications such as the full Microsoft Office® Suite of applications (Outlook, Word, Powerpoint, Excel, and Access), games, Internet browsing and the like.
- Such a computing device in one embodiment has at least a processor, mass storage, display processing capability, communication capability, input and output capabilities and a minimal display, with an operating system and application software stored on the device, while measuring not over 2 cm by 3 cm by 10 cm in volume, preferably not more than 2 cm by 3 cm by 6 cm in volume.
- At the time that this invention was made one available display would be an eReader. However, eReaders currently are not available as a separate portable display for a computing device with preferably at least one lens/sensor unit. In fact, one can conclude that eReaders (such as the Kindle II marketed by Amazon) are too small, too thick and too heavy to meet the requirements of the ut/ul display as defined above. eReaders are also not geared to display images in real-time, such as video images, generated from a lens/sensor unit in the computing device, which may be a smart phone with a camera.
- a separate and portable ut/ul display which is preferably self contained and is enabled to communicate preferably wirelessly with a computing device such as a cell phone is provided herein as one aspect of the present invention.
- the display is ultra thin.
- the display is an ultra thin ePaper or eInk like display in a thin housing, and in its entirety not thicker than 5 mm.
- the display with a portable housing has a thickness of 3 mm or thinner.
- the display screen material itself, for instance as announced by E Ink Corporation of Cambridge, Mass., is about 2 mm or thinner. Additional thickness or bulk is provided by a housing and in particular by an ultra-thin battery to power the display.
- the large display as provided as an aspect of the present invention is very well enabled to display a panoramic image.
- the screen material in a further embodiment of the present invention is enabled to display 3D images.
- An external and separate mobile and portable display as disclosed herein works with a mobile and portable computing device.
- a computing device has at least a processor and memory; it has preferably but not necessarily a display that is smaller than an external display, a transmitter to send a signal to the external display, and at least one lens/sensor unit enabled to record a digital image and transmit the image to the external display in real-time or close to real-time, allowing a user to watch in real-time or close to real-time a color video image recorded by the computing device.
- the computing device with at least one lens/sensor unit is a general computer working under an operating system and enabled to execute standard computer application software such as word-processing, e-mail, browsing, games and the like.
- the computing device with at least one lens/sensor unit is a mobile phone, enabled to connect wirelessly to a mobile phone network.
- the computing device with at least one lens/sensor unit is at least a mobile phone, enabled to connect wirelessly to a mobile phone network, combined with a camera, and is called a mobile camera phone.
- the computing device with at least one lens/sensor unit is a tablet PC with a screen smaller than the external display.
- the computing device with at least one lens/sensor unit is a smart phone, or PDA, enabled to connect wirelessly to a mobile phone network.
- the computing device with at least one lens/sensor unit is a stand-alone camera.
- a computing device has at least two lens/sensor units and is enabled to create in real-time a single image such as a video image that is created from at least two separate images.
Abstract
Methods and apparatus to create and display stereoscopic and panoramic images on a display screen are disclosed. Methods and apparatus are provided to generate multiple images that are combined into a stereoscopic or a panoramic image. An image may be a static image; it may also be a video image. A controller provides correct camera settings for different conditions. An image processor creates a stereoscopic or a panoramic image by applying the settings provided by the controller. A plurality of lens/sensor units is placed on a carrier. Lens/sensor units are rotationally aligned. A controller rotationally aligns images of lens/sensor units that are rotationally misaligned. The camera is enabled to communicate, via a wired or a wireless connection, with a separate, mobile, ultrathin, ultralight display with a large display screen not smaller than 25 by 20 cm in a first embodiment of the present invention and not smaller than 20 by 15 cm in a second embodiment of the present invention, to provide color images including video images in real-time. A compact controllable platform to hold and rotate a device with a lens/sensor unit is also provided.
Description
- This application is a continuation-in-part of U.S. Non-provisional patent application Ser. No. 12/634,058 filed on Dec. 9, 2009, which is a continuation-in-part of U.S. Non-provisional patent application Ser. No. 12/538,401 filed on Aug. 10, 2009, which are all incorporated herein by reference in their entirety. application Ser. No. 12/538,401 claims the benefit of U.S. Provisional Patent Application Ser. No. 61/106,025, filed Oct. 16, 2008, and of U.S. Provisional Patent Application Ser. No. 61/106,768, filed Oct. 20, 2008, which are both incorporated herein by reference in their entirety. This application also claims the benefit of U.S. Provisional Patent Application Ser. No. 61/291,861, filed on Jan. 1, 2010, of U.S. Provisional Patent Application Ser. No. 61/322,875, filed on Apr. 11, 2010, and of U.S. Provisional Patent Application Ser. No. 61/365,347, filed on Jul. 18, 2010 which are all incorporated herein by reference in their entirety. This application is also a continuation-in-part of U.S. Non-provisional patent application Ser. No. 12/435,624 filed on May 5, 2009 which is incorporated herein by reference in its entirety. application Ser. No. 12/435,624 claims the benefit of U.S. Provisional Patent Application Ser. No. 61/089,727, filed Aug. 18, 2008, and of U.S. Provisional Patent Application Ser. No. 61/055,272, filed May 22, 2008, and of U.S. Provisional Patent Application Ser. No. 61/054,290, filed May 19, 2008, which are all incorporated herein by reference in their entirety.
- The present invention relates to digital image devices. More specifically, it relates to a thin, large and lightweight display that is connectable to a portable computing device which has a digital camera. The camera is able to record a single image in one embodiment, and at least two images concurrently in a second embodiment, and to generate from those at least two images a stereoscopic image or a panoramic image, or a stereoscopic or panoramic image related to an existing image, to be displayed on the thin and large connectable display.
- Digital cameras are increasingly popular. The same applies to camera phones. The digital images taken by these devices use a digital sensor and a memory which can store data generated by the sensor. Data may represent a still image. Data may also represent a video image. Images may be viewed on the device. Images may also be transferred to an external device, either for viewing, for storage or for further processing.
- Panoramic images are also very popular and have been created from the time of photographic film to the present day of digital imaging. Stereoscopic or 3D images are also created by combining two images. A whole range of tools exists to combine two or more images of a scene into a single, combined, hopefully seamless panoramic image or into a stereoscopic image. This process of combining is called registering, stitching or mosaicing, especially in the case of panoramic images. An advantage of a panoramic image is to provide a view of a scene that is beyond what is usually possible with a common camera, with no or very little distortion. For stereoscopic imaging, combining two images requires providing the appropriate horizontal parallax to create the right depth illusion.
- The process of taking pictures to create a panoramic or stereoscopic image can rely on many different technologies, apparatus and methods. Very common is the method of taking a first picture with a single lens camera, followed by taking at least a second picture at a later time, and then combining the pictures. This method is not very user friendly, or requires complex cameras or complex camera settings. Furthermore, this method may be troublesome for creating video images.
- Images taken by a small computing device can usually be displayed on a small integrated display screen on the computing device. The quality and the size of such integrated display screens are generally barely adequate and mainly serve to provide feedback when reviewing an image, not for extended viewing of video or for high quality display of images. A large display screen would enable a better image viewing experience. However, a large display screen integrated into a portable computing device would defeat at least some of the benefits of portability and small size. One would also not want to carry a separate heavy and bulky connectable display. It would be beneficial to have a large display screen in a thin, large and light display device that is easy to carry and connectable to the computing device if so desired.
- Accordingly, novel and improved ultra-thin, large and ultra-light connectable displays with a large display screen are required.
- One aspect of the present invention presents novel methods and systems for recording, processing, storing and concurrently displaying a plurality of images, which may be video programs, as a panoramic or a stereoscopic image.
- In accordance with an aspect of the present invention, an apparatus is provided for generating a combined image from at least a first and a second image of a scene with a camera having at least a first lens being associated with a first image sensor for generating the first image and a second lens being associated with a second image sensor for generating the second image, comprising an operational memory, enabled to store and provide data related to a first setting of the second lens including an active sensor area, the first setting of the second lens being associated with data related to a first setting of the first lens, a controller, applying data related to the first setting of the first lens for retrieving from the memory data related to the first setting of the second lens, the controller using the retrieved data for processing image data related to the active sensor area of the second sensor to create a stereoscopic image.
- In accordance with yet another aspect of the present invention the camera system is provided, wherein the camera system is part of a mobile phone.
-
FIG. 1 is a diagram of a camera for panoramic and/or images in frontal view in accordance with an aspect of the present invention; -
FIGS. 2 and 3 illustrate a panoramic image created in accordance with an aspect of the present invention; -
FIGS. 4 , 5 and 6 illustrate a stereoscopic image created in accordance with yet another aspect of the present invention; -
FIG. 7 is a diagram of a camera in frontal view in accordance with another aspect of the present invention; -
FIG. 8 is a diagram of a camera in rear view in accordance with a further aspect of the present invention; -
FIG. 9 is a diagram of a camera in frontal view in accordance with yet another aspect of the present invention; -
FIG. 10 is a diagram of a camera in rear view in accordance with yet a further aspect of the present invention; -
FIGS. 11-13 are diagrams of a camera in frontal view in accordance with one or more aspects of the present invention; -
FIG. 14 is a diagram of a carrier of lens/sensor units for a camera for stereoscopic and or 3D images in accordance with an aspect of the present invention; -
FIG. 15 is a diagram of a carrier of lens/sensor units for a camera for stereoscopic and or 3D images in accordance with another aspect of the present invention; -
FIG. 16 is a diagram of a carrier of lens/sensor units for a camera for stereoscopic and or 3D images in accordance with yet another aspect of the present invention; -
FIG. 17 is a diagram of a system having a camera and a display in accordance with an embodiment of the present invention; -
FIG. 18 is a computing device in accordance with an aspect of the present invention; -
FIGS. 19-21 are diagrams of a camera with a thin large sized display screen in accordance with one or more aspects of the present invention; -
FIGS. 22-24 illustrate detachable displays in accordance with an aspect of the present invention; -
FIGS. 25-30 illustrate connectable large displays in accordance with an aspect of the present invention; -
FIGS. 31-33 illustrate large displays with embedded lens/sensor units in accordance with an aspect of the present invention; -
FIGS. 34-36 illustrate small thickness cameras to be integrated with a large thin display in accordance with an aspect of the present invention; -
FIG. 37 illustrates a remote controlled platform for a camera in accordance with an aspect of the present invention; -
FIG. 38 illustrates a display in accordance with an aspect of the present invention; -
FIGS. 39-40 illustrate a remote controlled platform for a camera in accordance with an aspect of the present invention; and -
FIG. 41 illustrates a screen of a display in accordance with an aspect of the present invention. - In earlier inventions the inventor has disclosed methods and apparatus for creating panoramic images. These inventions will briefly be recapitulated.
FIG. 1 shows in diagram a camera 100 having at least two lens/sensor units - An image sensor is associated with a lens. A lens may be combined with a sensor in a housing. The combination of a lens with its associated sensor is called a lens/sensor unit. A lens in one embodiment of the present invention is a fixed focus lens. In a further embodiment of the present invention a lens has a focus mechanism. A focus mechanism can be a manual focus mechanism, for instance by rotating the lens. In yet a further embodiment of the present invention, a lens has an auto-focus mechanism. Part of an auto-focus mechanism can be a
distance sensor 108 as illustrated in FIG. 1. In one embodiment of the present invention, based on a distance measured by a single distance sensor, all of the lenses of the at least two lens/sensor units are controlled. The distance sensor is shown as a sensor 108 outside the lenses. However, in one embodiment of the present invention, the distance sensor for focus setting is a through-the-lens distance sensor. - In one embodiment of the present invention the at least two lens/sensor units are applied to combine two images into a single three-dimensional or stereoscopic image. In that case the camera of
FIG. 1 is a stereoscopic camera. In one embodiment of the present invention the at least two lens/sensor units are used to create a panoramic image of a scene. In that case there may be more than two lens/sensor units, and the camera of FIG. 1 is called a panoramic camera. - During a calibration an active area of an image sensor can be defined, the active area being smaller than the actual pixel area of the sensor. Pixels are generally read in horizontal lines, called scan lines. A sensor can also be read in non-horizontal lines: vertical lines, slanted lines or a pseudo-random pattern. The sensor data can be stored in a memory. In accordance with an aspect of the present invention a processor processes only pixels or pixel data related to an active area of an image sensor. In one embodiment of the present invention a memory only stores pixels from an active sensor area. In a further embodiment of the present invention a part of the memory is defined that only stores pixels generated by an active area of a sensor. In a further embodiment of the present invention a processor is instructed to read and/or process only data from memory related to an active sensor area. In a further embodiment of the present invention a processor is instructed to process data depending on its position in a sensor with regard to the active sensor area.
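The active-area read-out described above can be sketched as follows. This is a minimal illustration: the sensor model, array sizes and function names are assumptions for the sketch, not part of the disclosure.

```python
# Sketch: reading only the calibrated active area of an image sensor.
# The sensor is modeled as a 2D list of pixel values; the active area
# (top, left, height, width) would come from a calibration step.

def read_active_area(sensor, active):
    """Return only the pixels inside the calibrated active area."""
    top, left, height, width = active
    return [row[left:left + width] for row in sensor[top:top + height]]

# A toy 4x6 sensor with a 2x3 active window starting at row 1, column 2.
sensor = [[10 * r + c for c in range(6)] for r in range(4)]
active = (1, 2, 2, 3)
print(read_active_area(sensor, active))  # [[12, 13, 14], [22, 23, 24]]
```

A calibration step would supply the `(top, left, height, width)` tuple; pixels outside it are simply never read or processed.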
- It is well known that more than two lens/sensor units may be applied to generate a panoramic image. For instance three or more lens/sensor units may be applied. To keep the explanation simple only two lens/sensor units are considered here, as illustrated in FIG. 2. Lens/sensor unit 201 corresponds to lens 101 and unit 202 corresponds to lens 102. In manufacturing it may be difficult to perfectly align the units -
FIG. 2 shows three objects recorded by the two sensors, and a merge line along which their images can be joined. -
merge line 308, as shown inFIG. 3 , apanoramic image 300 is formed. One has to address the rotation before images can be merged. One may adjust for rotation by for instance instructing a processor to read pixels in such a manner that the read pixel lines are aligned. One may also create an adjusted scanning direction based on a measured rotational misalignment. These adjustments including interpolation take place preferably before demosaicing of the raw image data. In one embodiment of the present invention, areas and parameters in image data are defined to perform for instance smoothing and blurring and pixel equalization to create a smooth transition between two active pixel areas. In a further embodiment of the present invention the image data is stored and/or displayed as a panoramic image, or the processed or raw but aligned data is stored in such a manner that a panoramic image is formed by merging two images each created from active sensor areas. (keeping in mind that one may also use three or more lens/sensor units. - In one embodiment of the present invention the camera with at least two sensor/lens units is calibrated for different conditions such as distance or focus setting, to associate a condition such as focus setting with all required settings to create an appropriate panoramic image and store those settings in a memory. When during operation of the camera a particular condition is present, a controller retrieves all associated settings from the memory and applies all the settings in processing image data to generate the panoramic image.
- A similar but modified approach can be applied for creating stereoscopic images. This is illustrated in
FIGS. 4 and 5. The issue here is also alignment of images. But rather than creating a wide image, the issue for 3D images is to create two images with a predetermined amount of parallax, so that when displayed appropriately an impression is created of a 3D image. FIG. 4 shows a stereoscopic camera 400 with two lens/sensor units 410 and 402. In one embodiment of the present invention, camera 400 has at least one lens focus sensor. In a further embodiment of the present invention camera 400 has an autofocus mechanism that focuses both lenses of camera 400. - Lens/
sensor unit 404 is shown to have a translation and rotation relative to 403. The rotational misalignment is an angle θr and the vertical (translational) misalignment is dv. The vertical misalignment may be measured from an object 409, which is recorded as 409 a by 403 and as 409 b by 404. In one embodiment of the present invention the vertical alignment or merge lines of the image sensors are determined by line 406, defined by a feature of the object, and the parallax line 409. The images also have a vertical displacement dv defined by lines -
FIG. 5 shows the aligned images to be appropriately displayed to create a 3D image. This may be done by a lenticular lens display screen, by using parallax barriers, by using anaglyphs, by alternately viewing left and right images, for instance using switching glasses, or by any other appropriate 3D viewing system. - The rotational alignment is addressed as in the panoramic case. It is known that for the most comfortable viewing the rotational misalignment should be minimized. The horizontal and vertical alignment can be addressed by reading the appropriate pixel lines at the appropriate starting point. As in the case of panoramic imaging one should calibrate the alignment for different conditions, for instance for different distances, zoom factors, light conditions and the like.
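Correcting the vertical displacement dv by reading pixel lines at the appropriate starting point, as described above, can be sketched as follows; the images and the offset value are illustrative:

```python
# Sketch: removing a measured vertical misalignment dv between the two
# images of a stereo pair by starting the second image's read-out dv
# scan lines later, so corresponding features land on the same row.

def align_vertically(left_img, right_img, dv):
    """Crop both images to the scan lines they share after a dv offset."""
    shared = min(len(left_img), len(right_img) - dv)
    return left_img[:shared], right_img[dv:dv + shared]

left = [["L0"], ["L1"], ["L2"]]
right = [["skip"], ["R0"], ["R1"], ["R2"]]  # recorded one scan line too low
print(align_vertically(left, right, 1))
# ([['L0'], ['L1'], ['L2']], [['R0'], ['R1'], ['R2']])
```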
FIG. 6 shows the combined images that generate a 3D image with appropriate horizontal parallax and no vertical displacement. - The camera, in one embodiment of the present invention, is calibrated for different conditions such as distance or focus setting, and a condition such as a focus setting is associated with all the settings required to create an appropriate 3D image; those settings are stored in a memory. When during operation of the camera a particular condition is present, a controller retrieves all associated settings from the memory and applies them in processing image data to generate the 3D image.
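The calibration scheme described above, for both the panoramic and the 3D case, can be illustrated as a lookup keyed on a measured condition; the condition values and setting names here are hypothetical, not taken from the disclosure:

```python
# Sketch: a calibration memory mapping a condition (here, focus distance
# in meters) to the settings needed to build the combined image.  The
# controller retrieves the settings of the nearest calibrated condition.

CALIBRATION = {
    0.5: {"merge_col": 610, "rotation_deg": 0.4, "v_offset": 3},
    2.0: {"merge_col": 598, "rotation_deg": 0.4, "v_offset": 2},
    10.0: {"merge_col": 590, "rotation_deg": 0.4, "v_offset": 2},
}

def settings_for(distance_m):
    """Retrieve the settings stored for the nearest calibrated condition."""
    nearest = min(CALIBRATION, key=lambda d: abs(d - distance_m))
    return CALIBRATION[nearest]

print(settings_for(1.4))  # nearest calibrated condition is 2.0 m
```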
- The following five patent applications describe aspects of image alignment and calibration for stereoscopic and 3D images and are incorporated by reference herein in their entirety: U.S. patent application Ser. No. 12/435,624 filed on May 5, 2009, U.S. patent application Ser. No. 12/436,874 filed on May 7, 2009, U.S. patent application Ser. No. 12/538,401 filed on Aug. 10, 2009, U.S. patent application Ser. No. 12/634,058 filed on Dec. 9, 2009 and U.S. Provisional Patent Application 61/291,861 filed on Jan. 1, 2010.
- In accordance with an aspect of the present invention a camera is provided with three lens/sensor units for creating panoramic pictures. This is illustrated in
FIG. 7 with the lens/sensor units of camera 700. In one embodiment of the present invention an autofocus mechanism, including a focus sensor, is included. In one embodiment of the present invention the lenses are single focus lenses. - Each of the images generated by these lens/sensor units can be combined to create a panoramic image. The camera may also have more than three lens/sensor units. The
units are rotationally aligned as described earlier. FIG. 8 shows the back side of the camera 700 with a display screen 800, which in one embodiment of the present invention is divided in at least three segments. In a first mode, segment 802 shows the image generated by lens/sensor unit 702. One may also display the image of other lens/sensor units in a display segment by selecting a display mode, for instance with button 804. - The
camera 700 contains at least a processor to process image data and to control the settings of the camera and the display screen, wherein the button 804 for instance can be used to select a display mode. Such a button may take the camera through consecutive modes by pushing it briefly, and a mode is selected by holding the button for a longer period of time. The camera also contains memory to store pre-set camera and display settings, which have been entered for instance during a calibration. The camera also contains a memory to store image data. The camera also contains a processor to process the sensor data according to preset settings, and the memory can hold the processed image data, which can represent a still image or video images and can be a single lens/sensor unit based image or an image constructed from at least two lens/sensor units. The components, such as a controller for controlling lens focus, lens aperture and shutter speed, the memories, and the sensor data and image processor, are connected internally, for instance by a data bus. Such structures are well known to one of ordinary skill in the art and are not shown in the figures to prevent obscuring the aspects of the present invention with known matter. Details of an internal processing and communication architecture of a digital camera are disclosed in for instance U.S. Pat. No. 7,676,150 to Nakashima issued on Mar. 9, 2010 and U.S. Pat. No. 7,667,765 to Turley et al. issued on Feb. 23, 2010, which are both incorporated herein by reference. - In a further embodiment of the present invention a control button enables a menu that is displayed on at least a segment of the display, allowing a user to select a mode of display and of taking an image. An image is recorded in a certain mode when the recording is activated by a button, for
instance button 807. In one embodiment of the present invention all images recorded by the sensors are stored, and the selected mode determines how the recorded image is displayed. For instance, in one mode only the image taken by the center lens/sensor unit is displayed. In another mode the created panoramic image is displayed. - In one mode the display screen functions as a viewfinder. The images (either single, multiple single or panoramic) are shown in that mode, but are not recorded. In another mode the display acts as a viewfinder and an image is recorded. Often, for instance during performances or during meetings, people take pictures of a scene in front of them. To obtain an unobstructed view for a camera they lift the camera above their head. The display of the camera, acting as a viewer, continues to display the images, thus acting as a light for a person at a distance behind the camera. In accordance with an aspect of the present invention a button is provided on the camera that switches off the display while keeping the camera in a recording mode. Such a button in a further embodiment switches the image display to a separate handheld display that shows the image recorded by the camera while the embedded display of the camera is switched off. This embodiment of a separate handheld display of a camera will be further explained and provided herein.
- In a further mode an image is displayed on the display screen that is not generated by one of the lens/sensor units but is, for instance, an archival image that is stored on the camera or that has been made available and transmitted to the display by another device. In one embodiment of the present invention an image is displayed on the display screen that serves as a background to compare with an image from one or more lens/sensor units of the camera. For this purpose the image may be displayed at a low intensity. It may be enhanced with edges of objects in the image by an edge detector, such as a Canny edge detector as is known in the art. The images viewed on the display screen that are provided by one or more of the lens/sensor units may be displayed in a transparent mode that overlays the archival or externally provided image.
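The low-intensity background overlay can be sketched as a per-pixel blend of a dimmed archival image with a semi-transparent live image; the Canny edge enhancement mentioned above is omitted here, and the blend weights are illustrative:

```python
# Sketch: overlaying a live viewfinder image on a dimmed archival
# background image, as described above.  Pixels are 0-255 gray values;
# alpha weighs the live image, dim darkens the archival background.

def overlay(archival, live, alpha=0.5, dim=0.5):
    """Blend a dimmed archival image with a semi-transparent live image."""
    return [
        [int(dim * a * (1 - alpha) + l * alpha) for a, l in zip(arow, lrow)]
        for arow, lrow in zip(archival, live)
    ]

archival = [[200, 200], [200, 200]]   # stored background image
live = [[100, 0], [255, 50]]          # current lens/sensor image
print(overlay(archival, live))        # [[100, 50], [177, 75]]
```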
- This embodiment of the present invention allows one to retake an image of a scene as close as possible to the original setting. The original picture may have been in black and white, or a picture with or without persons or objects. An image may also have been taken under different weather conditions, during a different season, at a different time of day, or under any other conditions that have since changed. One may want to retake an image or video that is as close as possible to the original setting or frame. For instance, a picture of a scene was taken with a single lens/sensor camera, and one wants to retake the picture as a video or as a panoramic or a 3D image. The overlay mode provided herein as one aspect of the present invention allows retaking that image as closely as possible to the original image.
- In a further embodiment of the present invention one may also extract edges from the current image generated by a lens/sensor unit. Image registration by comparing edges and transformation of images is known. One may apply an image registration as is known to one of ordinary skill in the art to match lens/sensor settings in such a way that the image generated by a lens/sensor unit optimizes a registration to the background image. Some aspects of using an archival image on a camera to improve an image are known and are described in U.S. Pat. No. 7,616,248 to Parulski et al. issued on Nov. 10, 2009, which is incorporated herein by reference. The archival images in Parulski are displayed and applied differently, and not to retake an image that was taken at a completely different time. Parulski also does not provide an image that is provided by another camera. Parulski also does not disclose image matching, and does not disclose using the archival image as a template for retaking the image as a video or as a panoramic or a 3D image.
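Registration itself is standard; a deliberately simplified version, matching raw pixels by exhaustive search over small shifts instead of matching edge maps, can be sketched as follows (all names, sizes and the search range are illustrative):

```python
# Sketch: estimating the (dx, dy) offset that best registers a live image
# to a background (archival) image by exhaustive search over small shifts,
# scored with the sum of absolute differences.  A real camera would match
# edge maps (e.g. Canny output) rather than raw pixels.

def sad(a, b):
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def shift(img, dx, dy):
    h, w = len(img), len(img[0])
    return [[img[(r - dy) % h][(c - dx) % w] for c in range(w)] for r in range(h)]

def register(background, live, max_shift=2):
    """Return the (dx, dy) shift of the live image that minimizes its
    difference with the background image."""
    return min(
        ((dx, dy) for dx in range(-max_shift, max_shift + 1)
                  for dy in range(-max_shift, max_shift + 1)),
        key=lambda s: sad(background, shift(live, *s)),
    )

bg = [[0, 0, 0, 0], [0, 9, 9, 0], [0, 9, 9, 0], [0, 0, 0, 0]]
live = shift(bg, 1, 0)        # live image displaced 1 pixel to the right
print(register(bg, live))     # (-1, 0): shift it one pixel back to the left
```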
- The
camera 700 also has an external port to dock to another device. Such a port may be a USB port, which allows the camera to be connected, for instance, to a computing device such as a computer for exchanging images. The port may also be used to connect the camera to a display having a large display screen to display the images taken by one or more lens/sensor units. The port may also be a wireless port.
camera 700 in a further embodiment of the present invention as illustrated inFIG. 8 has acommunication port 806 that communicates directly with a mobile computing device such as a mobile phone. Such a port can be enabled to work in a wired mode. It may also be configured to work in a wireless mode. For instance, the camera may have aport 806 that is a Bluetooth port that can exchange data with a mobile phone. The mobile phone can providecamera 700 with an image that serves as an archival image. Thecamera 700 in one embodiment of the present invention provides the mobile phone with image data, for instance of a panoramic image. The cell phone may have a different resolution display screen than thecamera 700. In a further embodiment of the present invention images are stored in a high resolution and are downsampled on a device to be displayed in an appropriate resolution on a display screen. - The above provided capability of importing and/or comparing a generated image with an archival image enables a further aspect of the present invention. In accordance with a further aspect of the present invention one may import an image that is an image of an object that is segmented from its background. In a further aspect of the present invention the camera has processing capabilities to segment an object from its background. Such image segmentation algorithms are well known and are for instance described in an online lecture notes “Segmentation” by Professor Allan D. Jepson downloaded from <http://www.cs.toronto.edu/˜jepson/csc2503/segmentation.pdf> which is dated as Fall 2009 and which is incorporated herein by reference.
- Combined with the above disclosed method and device of overlaying an image, in one embodiment of the present invention the segmented image is inserted into an actual image being taken by a lens/sensor unit in the camera. One may modify the size of the inserted image by opening a menu on the camera to be shown on the
display screen 1002 as shown in FIG. 10. An input device 1003 with different button modes can be used to activate a menu and select a menu option. Device 1003 may also act as a mouse to move a cursor over the display screen, to draw and select areas of interest which may contain an object, and to activate a menu option, which may be an object segmentation. One can thus select the object to be segmented, for instance by drawing a box around the object, which may be a person. One can also activate a segmentation program to segment the object from its background. The result, being an image of a segmented object, can then be selected by either again drawing a box around it or by clicking on 1003 after selecting it with a cursor. One can select a menu option to resize the selected segmented object. One can also move the selected segmented object on the display screen by moving a cursor with multi-directional control button 1003. The segmented object may be displayed as a transparent image over the scene actually imaged by a lens/sensor unit. When satisfied with the size and position, a user may confirm the position of the object on the display screen by button 1003 and change the image from transparent to opaque. One has thus placed a segmented object in the current image that can be recorded with the camera.
- In accordance with a further aspect of the present invention one may import a plurality of objects from different sources. For instance, the camera may be connected to a network, and receive a plurality of images from different sources via the network. In a further embodiment of the present invention, each image has an identifier that allows the camera to determine its relative size and earlier positions a particular segmented image had on the display screen. This allows the camera to position the image on a location and at a size that it had in a previous image. In a further embodiment of the present invention a display with a display screen is separate from the camera. For instance the camera may be connected via a communication port to a computing device having its own display screen, such as a PDA or a cell phone. This allows a user to position the camera out of reach, focused on the user. In a further embodiment of the present invention control of the camera is provided to the computing device such as cell phone. In yet a further embodiment of the present invention the computing device can open at least one menu on the display screen of the computing device that enables control of the separate camera. This allows a user to be part of an image taken by the camera while retaining control of the camera. In one embodiment of the present invention the display and camera are physically separate and are connected through a wired connection. In a further embodiment of the present invention the display and camera are physically separate and are connected through a wireless connection.
- Imported images may be recorded in stereoscopic format. In a further embodiment of the present invention one may either receive the constituting separate 2D components of each image, or one may separate the stereoscopic image into the constituting 2D components and reassemble the components with a parallax representing a desired distance from a point of view on the device. In a further embodiment of the present invention an imported 3D image is reduced to a 2D image and the 2D image is synthesized into a 3D image having a parallax that reflects a desired distance from the object to the point of view.
- In another embodiment of the present invention a plurality of users have a camera with remote control and display screen as provided above. Each camera is trained and focused on the user and generates a segmented image that is shared with at least one other camera.
- In a further embodiment of the present invention a
camera 900 is provided for creating a 3D image from at least two lens/sensor units, as shown in FIG. 9. While the camera may look the same as a two-lens/sensor-unit panoramic camera, there are differences. In both the panoramic and 3D cameras the sensors have to be aligned appropriately. However, in the 3D case the lenses (or lens optical axes) should be substantially or almost parallel. In the panoramic case the lens optical axes should be positioned at a slight angle to create a broader field of vision. - The embodiments of the present invention as provided above with regard to remote control and image insertion are contemplated to be applied to the panoramic and the 3D embodiments of a camera.
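The difference between the 3D case (parallel axes) and the panoramic case (axes at a slight angle) can be illustrated with a rough field-of-view calculation. The lens angles used below are illustrative assumptions, and the simple sum ignores lens distortion and assumes the two images still overlap:

```python
def combined_fov(lens_fov_deg, axis_angle_deg):
    """Approximate horizontal field of view covered by two lenses whose
    optical axes diverge by axis_angle_deg, assuming overlap remains
    (axis_angle_deg < lens_fov_deg)."""
    return lens_fov_deg + axis_angle_deg

# Two 60-degree lenses at a 40-degree relative angle cover roughly a
# 100-degree panorama; with parallel axes (the 3D case, angle 0) the
# combined view stays 60 degrees wide but yields depth information.
```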
- Recently Sharp Corporation of Japan in a press release dated Apr. 2, 2010, which is incorporated herein by reference, announced: “Sharp Corporation has developed a 3D touchscreen LCD featuring the industry's highest brightness that can switch between 2D and 3D modes. Users can view 3D images without the need to wear special glasses, making this LCD ideal for mobile devices such as digital cameras, mobile phones, and smartphones. The newly developed 3D LCD uses a parallax barrier system to display 3D images. This parallax barrier, which has a series of vertical slits, is incorporated into an ordinary LCD to control the path of light reaching the right and left eyes, thus creating a sense of depth. However, the conventional 3D LCD with this system had difficulty in displaying high-quality images in 3D mode.”
- It may be that cellphone manufacturers are not able or willing to modify a device they market to accommodate 3D and/or panoramic imaging. In such cases, and in others, it may be advantageous for a user of such a device to purchase a connectable or even dockable device that can take and/or display panoramic and/or 3D images. For instance, one may take a panoramic image with a camera and send it to the device that is connected with the camera. One may also take a 3D image with the camera and display it on the camera, and send the single-lens image for display to a connected computing device, or send the 3D image to the connected device for storage and/or further transfer and/or non-3D display.
- A camera, in accordance with an aspect of the present invention, is a 3D camera having at least 2 lens/sensor units. A camera, in accordance with another aspect of the present invention, is a panoramic camera having at least 2 lens/sensor units. A camera, in accordance with an aspect of the present invention, is a combined 3D and panoramic camera having at least 4 lens/sensor units. One embodiment of the
present invention 1100 of such a combined 3D/panoramic camera is shown in diagram in FIG. 11. A combination of two of the lens/sensor units creates the 3D image, and another combination creates the panoramic image. - In another embodiment of the present invention an additional lens/
sensor unit 1201 is placed between 1103 and 1104, as is shown in a diagram of a camera 1200 in FIG. 12. This allows for a higher quality panoramic image. It also allows for using an image generated by 1201 to generate a neutral or reference image. - In yet another embodiment of the present invention as is shown in diagram in
FIG. 13 of a camera 1330, all lenses for 3D and panoramic image generation are placed on a single line, along which the lens/sensor units are arranged. - Other configurations of lens/sensor units are possible and are fully contemplated. One may have multiple rows of lens/sensor units. One may arrange lens/sensor units in other patterns with overlap between images. In yet another embodiment of the present invention a camera may have at least two sets of two lens/sensor units wherein each set has its lenses at parallax distance with parallel optical axes, and wherein the optical axes of the first set and the at least second set have an angle between them to create at least two panoramic images which are combined into a panoramic 3D image.
-
FIG. 14 illustrates another aspect of the present invention. It shows a lens/sensor carrier 1400 with lens/sensor units attached in alignment. It was explained above that sensors have to be rotationally aligned to create appropriate panoramic images and to diminish negative effects on 3D images. One way to assure alignment is to attach at least two lens/sensor units to a rigid carrier material 1401. The lens/sensor units can be aligned during manufacturing and/or assembly of 1400 and should be fixed in their aligned position. Material 1401 should be rigid with negligible bending and distortion over the lifetime of the camera. This requires that the carrier also not be bent or distorted after being assembled into a camera. Alternatively, one may fix at least one lens/sensor unit in an aligned position after the carrier is fixed into the camera. As another alternative, a motor 1405 that can rotate the lens/sensor unit can be included on the carrier 1400. This allows for motor-driven alignment of the lens/sensor unit. The motor can be a stepping motor with a mechanism that significantly reduces the rotation angle of the lens/sensor unit per step. Per motor step the rotation of the lens/sensor unit is well below 1 degree in one embodiment of the present invention, and smaller than 1 minute in yet another embodiment of the present invention. A motor may be an actuator such as a piezo-electric actuator. Piezo-electric actuators are available that have a minimum displacement below 1 micron, thus allowing rotational displacement of a sensor that is in the single pixel range. -
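The step-reduction arithmetic implied above (well below 1 degree, or under 1 minute, of sensor rotation per motor step) can be sketched as follows; the 1.8-degree step angle and 2000:1 reduction ratio are illustrative assumptions, not values from the specification:

```python
def rotation_per_step(motor_step_deg, reduction_ratio):
    """Angle (in degrees) the lens/sensor unit rotates per motor step,
    given a stepping motor step angle and a mechanical reduction ratio."""
    return motor_step_deg / reduction_ratio

def steps_for_alignment(misalignment_deg, motor_step_deg, reduction_ratio):
    """Number of whole motor steps needed to correct a given rotational
    misalignment of the sensor."""
    per_step = rotation_per_step(motor_step_deg, reduction_ratio)
    return round(misalignment_deg / per_step)

# A 1.8-degree stepper driving the unit through a 2000:1 reduction moves
# the sensor 0.0009 degrees (about 3.2 arc-seconds) per step -- well
# under the 1-minute-per-step figure mentioned in the text.
```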
FIG. 15 shows a further embodiment of the present invention of a lens/sensor assembly 1500 with 3 lens/sensor units. Herein an additional lens/sensor unit 1502 is provided. Also lens/sensor unit 1403 is provided with a motor or actuator 1501 to allow mechanical alignment of the sensors. -
FIGS. 14 and 15 illustrate the carriers 1400 and 1500. FIG. 16 shows in diagram the carrier 1500 in side view. FIG. 16 shows that the optical axis of 1404 is rotated by a slanted support wedge 1502. The rotation angle effected by the support in one embodiment of the present invention is less than 10 degrees. The rotation angle effected by the support in another embodiment of the present invention is more than 10 degrees but less than 30 degrees. The rotation angle effected by the support in another embodiment of the present invention is more than 30 degrees. - In yet a further embodiment of the present invention, lens/sensor units may be placed on a carrier and bonded to the carrier in a high precision environment. For instance, at least two sensor units may be accurately placed on the carrier and at least rotationally aligned relative to each other. One can align the sensors in a calibration environment by providing an alignment image and using a mechanical or robotic high precision placement mechanism. After fixing the sensors in at least rotational alignment one may place a lens mechanism over a sensor. In that sense a carrier becomes an integrated panoramic and/or 3D imaging unit that is robust and can be placed in a housing of a camera. Electronic components and communication connections may also be placed on the carrier. The most sensitive stage, which is probably the rotational alignment, can take place in a manufacturing or laboratory environment that accommodates this type of high precision positioning and bonding. Such a high precision environment is for instance present in semiconductor processing and is known from semiconductor steppers.
- In an additional step one may also provide translational alignment of the sensors or the lens/sensor units.
- Once the carrier unit has its lens/sensor units fixed in aligned position, one may install additional components such as lenses, zoom lenses, an auto-focus mechanism, a focus mechanism for the lenses if required (when the lenses are not fixed-focus lenses, which is also possible in a further embodiment of the present invention), an aperture setting mechanism, a shutter mechanism, light and distance sensors and the like, which are all known components in photographic imaging. One may also provide the correct dimensions, fixtures, buttons, controllers, processors, memory and other electronic components on the carrier. In a further embodiment of the present invention the carrier with at least the lens/sensor controllers and supporting electronics and connectors may be in a shape and completeness that allows a camera manufacturer to obtain a complete panoramic, 3D or 3D/panoramic unit that can be assembled with housing, buttons and knobs, and a display screen into a complete panoramic, 3D or 3D/panoramic camera.
- In a further embodiment of the present invention the carrier may include an external communication link or may be attached to a communication link or device in the housing that allows communication such as the exchange of data, which may be image data or control/instruction data with the outside world, such as a computing device.
- Preferably, the calibration steps that were already disclosed as an aspect of the present invention are performed during or right after an initial power-up of the electronics on the carrier with the lens/sensor units attached, and the settings stored in, for instance, a memory such as a ROM memory. If one achieves a reliable level of accuracy and repeatability in creating the carrier with lens/sensor units attached, one may skip the actual calibration step for the mass-produced carriers and replace it by downloading the calibration settings from an external memory or implementing them in an electronic device on the carrier. The calibration settings are determined from a mother or standard carrier, against which the performance of mass-produced carriers may be tested.
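The power-up calibration flow described above can be sketched as follows, assuming the settings are a simple mapping persisted in the carrier's memory. All names and the dict-based ROM model are illustrative assumptions:

```python
def load_calibration(rom, factory_defaults, calibrate=None):
    """Return calibration settings for a carrier at power-up.

    rom: dict acting as the carrier's persistent memory.
    factory_defaults: settings determined on a mother/standard carrier.
    calibrate: optional callable performing an actual calibration run;
               when omitted, the stored or downloaded defaults are used.
    """
    if "calibration" in rom:
        # Settings were already stored during an earlier power-up.
        return rom["calibration"]
    if calibrate is not None:
        settings = calibrate()             # measure on this carrier
    else:
        settings = dict(factory_defaults)  # download mother-carrier values
    rom["calibration"] = settings          # persist for subsequent power-ups
    return settings
```

After the first power-up the stored settings are reused, so the sensitive calibration step runs at most once per carrier, as the text prefers.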
- In summary, a camera is provided that is a panoramic camera, a 3D camera or a 3D/panoramic camera. The camera may be a stand-alone camera. It may also be part of a computing device or a mobile phone. The camera has a communication interface to communicate with a network or with a computing device to exchange data, including image data. The camera may transfer image data to the network or computing device. The camera may also transfer status information to the network or computing device. A computing device may have a display screen for displaying an image taken by the camera. The camera may also receive data from the network and/or computing device. The camera may receive image data, instructions to perform certain tasks or processing updates from the network and/or computing device. The camera may be connected to for instance a PDA or mobile phone to act as an external imaging device. A connection between the camera and a computing device such as a mobile phone may be established via a wired connection, including a USB port, or a wireless connection such as a Bluetooth® connection.
- In a further embodiment of the present invention the camera may also have a connection directly with a display screen, wherein the display screen is enabled to display image data but is not enabled to run general purpose computer programs. Devices to display stored documents are known, for instance eReaders such as those from Sony® and Amazon's® Kindle®. Apple's® iPAD® has its own display screen, but has additional computing capabilities and is in fact its own computing device. Computing devices with a display screen are another embodiment of the present invention with which the camera as provided in one or more aspects of the present invention can communicate to display image data in real-time or substantially in real-time. Some minimal transmission delay and buffering delay may create a somewhat delayed real-time experience. It is pointed out that real-time or substantially real-time display of an image, graphics, text, or an interface generated by a computing device herein is different from a computing device completing an image, text, document or otherwise completed file, transmitting it to a display, and the display processing the file to a display screen.
-
FIG. 17 illustrates a display embodiment of the present invention. The camera 1700, which may be a panoramic/3D camera in accordance with an aspect of the present invention, has a communication port/interface 1701 to communicate with a communication port 1708 of a display 1705. The display 1705 has a display screen 1706 which may be enabled to display 3D images. The display has display screen controls 1707 to allow a user to provide instructions related to an image. The controls may be used to adjust contrast, intensity and size of images, and to provide instructions to move forward or backward in a series of images, zoom, capture, store and recall images and the like. In one embodiment of the present invention, the display 1705 also has a processor and memory/storage to allow processing of an image and storage and retrieval of images. In one embodiment of the present invention storage capabilities are temporary and limited to a session, for instance lasting for the period of time that the camera 1700 is connected to the display 1705. In one embodiment of the present invention processing capabilities of the display 1705 are limited to image processing and do not provide general computing capabilities. - In yet a further embodiment of the present invention, the display is connected to a
mobile computing device 1800 as illustrated in FIG. 18. The mobile computing device 1800 may include a cellphone with a dial pad 1803, a display screen 1802, a pointing device 1804, cellphone capabilities and an antenna 1801 to connect to a communication network. It also has a communication interface 1805 which allows communication with display communication interface 1708. Device 1800 also may have a communication interface to communicate with camera interface 1702. Communication between devices may be via wired or wireless communication links. - The
device 1800 has a powerful processor 1806 enabled to run an operating system such as Microsoft Windows® or Linux or any other operating system, and applications such as are commonly used on a computer, including a word processor, presentation software, communications, e-mail, browsers, spreadsheets, games, scientific software and the like, or any other computer program that can run on a computing device. The device 1800 also has mass storage capabilities 1807 to store programs and data and the like. In a further embodiment of the present invention the device 1800 has a projector 1808 which is enabled to project an image on a screen or a wall. It may also have a projector 1809 which is enabled to create a virtual keyboard or input device, such as disclosed by U.S. Pat. No. 7,340,077 to Gokturk issued on Mar. 4, 2008 which is incorporated herein by reference. - The size of
device 1800 is very small; it can be carried in a jacket or trouser pocket or a purse and is significantly smaller than present day laptop computers such as those manufactured for instance by Hewlett Packard® or Apple®. Such a complete computer can be so small because it does not have bulky input and output devices or storage media such as large magnetic disks. - In a further embodiment of the present invention the
device 1800 includes the camera 1700. - The problem of lacking input/output devices can be solved with locally positioned
display devices 1705, which may be eReader devices as disclosed above. The devices may have a keyboard or a virtual keyboard, for instance created with screen 1706 being a touch screen or with a projector 1710 which may project a virtual keyboard. - In common situations where people are known to use a computer, such as on an airplane, in one embodiment of the present invention a display is part of an airplane seat. A keyboard may also be part of a seat. The display and keyboard have communication ports to connect to a
computer 1800 of a passenger. - In yet a further embodiment of the present invention a camera is part of an eReader. This is shown in diagram in
FIG. 19. The term eReader herein is used as a term for a thin, large, lightweight display that acts as a true display, enabled to provide still and video images in full color in real-time, but which is much thinner than a tablet such as the Apple iPAD®. An eReader in its known form displays the content of a file. The term eReader as an aspect of the present invention includes the capability to show screens and images in real-time. - It is believed by the inventor that computing is still not fully mobile. A computer requires a processor, storage, an input device and a display. Current devices are compromises along the different properties of a computing device. If they are small, they usually have small displays. With larger displays, computers become bulky and look like current tablets or laptop computers. By putting all processing, storage and communication capabilities in a very small footprint device and providing a large, thin and lightweight display that is separate from but connected, wired or wirelessly, to the computing device, a very portable, mobile and fully usable (with no screen limitations) computer has been provided. The unexpected aspect is that the usability of the device is improved by separating the screen/display from the actual processor-containing device, contrary to the now-common practice of integrating the two in laptops, PDAs, mobile phones, smart phones, known eReaders, tablets and the like.
- The
novel eReader 1900 has a screen 1901. Such a screen is larger in surface than a standard camera display screen. A lens 1902, which may be a plurality of lenses as discussed above, each lens associated with an optical sensor, is/are captured or manufactured into a housing of the eReader or other portable device 1900 that has a screen. Though many eReaders currently have a black/white screen, known as ePaper or eInk, manufacturers at the time of this invention have indicated plans to move into the area of color display screens. There are different technologies that can be applied in high quality large color display screens. One embodiment uses the technology of Organic Light Emitting Diode or OLED display screens. Another is a color electrophoretic display screen such as disclosed in U.S. Pat. No. 7,167,155 to Albert et al. issued on Jan. 23, 2007 which is incorporated herein by reference. A further possibility is to use a 3D screen as manufactured by Sharp Corporation as was discussed above. In one embodiment of the present invention one thus creates an eReader with a multi-lens camera as shown in FIG. 19. In a further embodiment of the present invention a single lens camera as is shown in FIG. 20 is part of an eReader. It shows an eReader housing 2000 with a screen 2001. Incorporated in the housing is a single lens/sensor unit 2002. In an embodiment of the present invention of FIG. 19 an eReader is provided that is also a 3D camera or a panoramic camera or a 3D/panoramic camera. These cameras are contemplated to have one or more aspects of the cameras as already described herein. - The advantage is that the display has a large display screen and is portable and preferably light in weight. The size of a screen in one embodiment of the present invention is not smaller than a diagonal measurement of 6 inches. In another embodiment of the present invention a size of a screen is not smaller than a diagonal measurement of 10 inches. 
In another embodiment of the present invention a size of a screen is not smaller than a diagonal measurement of 15 inches. In one embodiment of the present invention the length and width of a display screen are the same. In another embodiment of the present invention the length is greater than the width.
- The large screen size allows an unparalleled direct view of a recorded image. Even at relatively low resolution this is an attractive option. The current Apple iPAD® has a color screen with a 1024-by-768-pixel resolution at 132 pixels per inch. A novel eReader with a camera and a screen of such a resolution, as provided as an aspect of the present invention, will provide a completely new experience and enjoyment of taking pictures and video images with instant and real-time review capabilities.
- A novel eReader is provided as an aspect of the present invention to include additional functionality, as described previously, for taking pictures or videos.
- In another embodiment of the present invention,
devices have image sensors 1902 captured or manufactured into a housing of the device 1900. FIG. 19 shows how lenses 1902 are provided on the side of the housing opposite the screen or display. One may provide a housing with lenses that is hinged to a housing of the display, so one can swivel the lenses in relation to the display. - In another embodiment of the present invention, the devices as shown in
FIGS. 19 and 20 are only a camera with a very large display screen, memory to store data, processing capabilities to display images from an image sensor or from memory on the display, and a control to control the camera and the screen. In one embodiment no other functionality is required and the devices are just digital cameras with a very large viewing screen. In a further embodiment other capabilities are added. In a further embodiment these cameras with very large screens have a communication port to off-load data such as image data, for instance to a computing device. In a further embodiment these devices are also enabled to receive data through a communication port. In yet a further embodiment, these large display screen cameras have additional processor and programming capabilities. Research into display housings and camera housings at the time of the present invention has not turned up existing, completed or ongoing research to create such a camera, which is believed to be novel and non-obvious. It appears that no one has currently or previously articulated what the properties of a large-screen, thin and highly portable camera should be. No consideration has previously been given to what some properties of such a camera should be and what inventive steps are required to achieve such a desirable camera or, as will be provided later, a large-screen, ultra-thin and ultra-light connectable display device. - The realization of this type of camera and display device is enabled by providing a thin, sturdy, lightweight housing, small and highly integrated powerful processing, memory and communication capabilities, very small lens/sensor units with appropriately sized connectors, lightweight display screens and appropriate power sources. This will enable a portable camera or display device that is ultra-light and ultra-thin and thus ultra-portable, with a very large display screen offering a novel experience in still image and video recording and viewing. 
In one embodiment of the present invention, the large display screen camera has only one lens/sensor unit. In another embodiment of the present invention the large display screen camera has at least two lens/sensor units.
- The camera with large display screen is characterized by its format, its light weight and its thinness. It is portable and easy to carry and to store, for instance in a briefcase or portfolio. It has a format that provides a large display screen while being thin and of low weight. For instance the display screen has a diagonal that is larger than 10 cm in one embodiment of the present invention, larger than 15 cm in another embodiment of the present invention, larger than 20 cm in yet another embodiment of the present invention and larger than 30 cm in yet another embodiment of the present invention. The thickness in one embodiment of the present invention is less than 3 cm, less than 2 cm in another embodiment of the present invention, less than 1 cm in yet another embodiment of the present invention and less than 0.5 cm in yet another embodiment of the present invention. The weight in one embodiment of the present invention is less than 20 ounces. The weight in another embodiment of the present invention is less than 15 ounces. In yet another embodiment of the present invention the weight is less than 10 ounces. The weight and thickness will be determined by the power supply, housing and display screen, and by the way of selecting, modifying and combining these elements into a device.
- The construction and assembly and its components of an eReader is for instance disclosed in a Kindle II teardown described in 5 articles on a website which includes <http://techon.nikkeibp.co.jp/english/NEWS_EN/20090421/169102/> dated May 12, 2009; <http://techon.nikkeibp.co.jp/english/NEWS_EN/20090422/169162/> dated May 13, 2009; <http://techon.nikkeibp.co.jp/english/NEWS_EN/20090424/169296/> dated May 14, 2009; <http://techon.nikkeibp.co.jp/english/NEWS_EN/20090427/169398/> dated May 15, 2009 and <http://techon.nikkeibp.co.jp/english/NEWS_EN/20090428/169498/> dated Mar. 18, 2009 which are all incorporated herein by reference.
- The lens/sensor unit may be an integrated lens/sensor unit for instance as available from Omnivision of Santa Clara, Calif. of which one example is the CameraCube. Samsung of Korea also manufactures very thin and powerful camera modules. For instance on URL <http://www.engadget.com/2008/03/18/samsung-announces-worlds-thinnest-8-megapixel-cellphone-module/> on Mar. 18, 2008 a small ultrathin camera module was shown that was about 8.5 mm thin and has a 5 Megapixel sensor and a lens. It is about 1.5 by 3 cm in width and length. Such a module can be incorporated in a thin housing of a large display or of a reader. The module has internal processing capabilities. It is also provided with a connector that makes it easy to connect to other components of a camera, to a processor, to a controller, to a bus or other communication devices and to a display. In fact several of these modules can be incorporated in the housing to enable a camera as disclosed herein in accordance with one or more aspects of the present invention. A copy of the article/website is incorporated herein by reference.
- Camera controllers are also known and are for instance disclosed in U.S. Pat. No. 6,727,941 to Coleman, issued on Apr. 27, 2004 which is incorporated herein by reference.
- A large display screen system with a controller is the iPAD® of which a teardown is provided in <http://www.isuppli.com/News/Pages/User-Interface-Focused-iPad-Changes-the-Game-in-Electronic-Design-iSuppli-Teardown-Reveals.aspx> which is incorporated herein by reference.
- In another embodiment of the present invention a large display screen portable device, such as a novel eReader, is enabled to communicate with a camera, such as a camera phone, by wired or wireless communication and is enabled to display an image or video taken by the camera.
- A large display screen camera such as provided in
FIGS. 19 and 20 is, in a further embodiment of the present invention, provided with a communication device to communicate with another device or with a network. One example of such an interface is a Bluetooth interface, of which an extensive description is available on <http://www.bluetooth.com/English/Technology/Building/Pages/Specification.aspx>. An example of a network communication interface is the Wi-Fi interface, of which a specification is available on <www.wi-fi.org>. - If eInk or similar displays are used, either in black/white or color, one issue may be the refresh rate of the screens. Refresh rates as low as 0.1 sec have been reported, though refresh rates of 0.5 up to 1 sec are also common. High quality video requires about 50 or 60 frames per second. These can be achieved by high quality display screens. In case low refresh rate display screens are used, one may still display ‘sampled’ video images. For that purpose a display has a controller with a clock that selects images from a sequence of video images, which may be stored in a memory in the camera, to provide a lower video frame rate to accommodate the display screen. In one embodiment of the present invention a controller selects fewer than one in 10 of the frames from an image sequence to be displayed. Camera phones in general accommodate frame rates of about 15 fps. In another embodiment of the present invention a controller selects fewer than 25 fps but more than or equal to 10 fps from an image sequence to be displayed. In another embodiment of the present invention a controller selects more than 25 fps from an image sequence to be displayed. This means that a camera records high video rates, up to 60 fps or higher, and stores these video sequences, for instance to be played on an external display, but selects a lower display screen rate to accommodate the refresh rate of the current display.
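Selecting a lower display frame rate from a high-rate recording, as described above, can be sketched as picking evenly spaced frames; the function and parameter names are illustrative assumptions:

```python
def sample_frames(frames, source_fps, display_fps):
    """Select a subset of recorded frames so that playback matches a
    display refresh rate lower than the recording rate.

    frames: sequence of recorded frames (e.g. a 60 fps capture).
    source_fps: recording frame rate.
    display_fps: rate the display screen can refresh at.
    """
    if display_fps >= source_fps:
        # The display can keep up: show every frame.
        return list(frames)
    step = source_fps / display_fps
    # Pick frames at evenly spaced intervals across the recording.
    return [frames[int(i * step)] for i in range(int(len(frames) / step))]
```

For a 60 fps recording shown on a display that refreshes 10 times per second, every sixth frame is displayed, matching the "sampled" video playback the text describes.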
- In another embodiment of the present invention a camera records images at a rate lower than 60 fps. For instance a display screen of the camera may be the main display screen for displaying images. No higher-fps display is envisioned in such an embodiment of the present invention, and only a reduced video frame rate is implemented. In one embodiment of the present invention the recording rate is less than 10 fps. In another embodiment of the present invention the video rate is lower than 25 fps but higher than or equal to 10 fps. In another embodiment of the present invention the video rate is equal to or higher than 25 fps.
- In a further embodiment, a device having a large display receives image data that is of a higher resolution than can be displayed on the current display screen. In one embodiment the device with the large display screen performs downsampling of the higher resolution image data. In another embodiment a separate computing device performs downsampling of the higher resolution image data before it is transmitted to the display device.
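The downsampling step described above can be sketched as block averaging over a grayscale image; the integer scale factor and the list-of-lists image representation are illustrative assumptions:

```python
def downsample(image, factor):
    """Reduce a grayscale image (2D list of pixel values) by averaging
    each factor-by-factor block of pixels into a single output pixel."""
    out = []
    for r in range(0, len(image) - factor + 1, factor):
        row = []
        for c in range(0, len(image[0]) - factor + 1, factor):
            # Average one factor-by-factor block into one pixel.
            block = [image[r + i][c + j]
                     for i in range(factor) for j in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out
```

Either the display device or a separate computing device could run such a step, matching the two embodiments in the text.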
- In one embodiment of the present invention a
thin camera device 2100 with a large display screen 2101, as shown in diagram in FIG. 21, has a large display screen that is rectangular but not square. One preferred way to display an image such as a still image or a video is in landscape format as shown in FIG. 21. In another embodiment of the present invention the device 2100 is a multi-purpose computer or an eReader for instance. As an eReader the device can be used in portrait mode. A multi-lens unit 1902 as shown in FIG. 19 in the shown position can take 3D and panoramic images in landscape mode. One may want to take images in landscape mode even if the display screen is oriented in portrait mode. This is addressed in the embodiment of the present invention shown in FIG. 21. The multi-lens unit 2102 is part of a rotatable housing. This housing can be rotated by pushing for instance a protrusion 2103 from position 2104 (in landscape position) to position 2105 (in portrait position). In position 2105 the image can be shown in diminished size as a landscape image on a display screen in portrait orientation. The image can be stored and also be displayed as a landscape image on the display screen in landscape orientation. - Stand-alone digital picture frames are increasingly popular. These allow the display of a picture taken by a digital camera on a computer display screen. Such a digital picture frame is for instance disclosed in U.S. Pat. No. 6,975,308 to Bitetto et al. issued on Dec. 13, 2005 which is incorporated herein by reference. However, these digital displays require a connection to connect to a computer device and/or a camera. In one embodiment of the present invention, a digital picture display is provided wherein the camera is part of the display as is shown in for instance
FIGS. 19-21. - In a further embodiment of the present invention a display screen of a portable and self-contained computing device is part of a detachable, portable and self-contained display. This is illustrated in
FIG. 22 with a computing device 2200 and a detachable display 2206 with a screen 2211. The computing device 2200 communicates with the display 2206 through a port 2201 in the computing device and a communication port 2205 in the display 2206. The communication ports 2201 and 2205 may be wired ports or wireless ports. - In case the
computing device 2200 and the display 2206 are detached and are both self-contained, they both require their own power source, which may be a battery. A battery 2207 to power the display 2206 has been identified in FIG. 22. In a further embodiment, the display 2206 has an inductive receiver 2207 to power the display. Inductive power in that case is provided from an external source and enables the display 2206 to receive inductive power with power receiver 2207. This aspect of inductive power has been disclosed in U.S. Pat. No. 7,741,734 to Joannopoulos et al. issued on Jun. 22, 2010, which is incorporated herein by reference in its entirety. - A wired connection such as a USB wired connection can be used to have the power source of the computing device power the display. A USB connection is preferably a real-time USB connection and not a file-oriented connection. A separate wire to a power source in another embodiment can also be used to power the display. It is noted that in a wired USB connection the USB mini-A or mini-B connector can be used. The actual receptacles and the plugs therein are 3.1 mm and are sufficiently small to be used in a display housing that is 5 mm thick. However, the housing of a USB mini receptacle is about 8.1 mm thick. In accordance with an aspect of the present invention a mini USB receptacle is modified to fit in a display housing that in one embodiment is 7 mm thick or thinner, and in another embodiment 5 mm thick or thinner. In one embodiment the receptacle of a USB Micro may be used, if needed modified to fit the needs of connecting the display with the computing device and allowing the computing device to power the display. In a further embodiment the housing of the receptacle is part of the housing of the display.
The state of the art is sufficient to enable a wired connection between a computing device and a display with a receptacle in the display wherein the display thickness is smaller than 5 mm down to a display thickness of 3 mm and wherein the receptacle is part of the housing and the actual connector is about 1 mm thick.
- In one embodiment a battery is an ultra-light and ultra-thin battery made of nanocomposite paper as disclosed in the article "Flexible energy storage devices based on nanocomposite paper" by Pushparaj et al., Proceedings of the National Academy of Sciences (PNAS), Aug. 21, 2007, Vol. 104, no. 34, pages 13574-13577, available from www.pnas.org, and in US Patent Publication Ser. No. 20100050779 to Pushparaj et al. published Mar. 4, 2010, which are both incorporated herein by reference. A very thin battery not thicker than 1 mm is disclosed in U.S. Pat. No. 6,696,198 to Okahisa et al. issued on Feb. 24, 2004, which is incorporated herein by reference.
- In one embodiment of the present invention the
computing device 2200 and the display are combined as one apparatus. In such a configuration the computing device 2200 and the display 2206 are connected and combined, for instance through a separable hinge with a male part 2203 having an axis with a spring-loaded ball connector 2209 and a female hinge part 2204 on the display 2206 that is enabled to receive and lock the spring-loaded ball connector axis 2209. A user may separate the display 2206 from the device 2200. - The separated display in one embodiment of the present invention is enabled to collaborate with the computing device within at least 1 meter of the computing device. In a further embodiment of the present invention the display can collaborate with the computing device at a distance not exceeding 3 meters. In yet a further embodiment of the present invention the display can collaborate with the computing device at a distance not exceeding 10 meters. In yet a further embodiment of the present invention the display can collaborate with the computing device at a distance exceeding 10 meters.
- In a further embodiment of the present invention the computing device has at least one lens/
sensor unit 2201 which allows the device to record an image, which may be a video image. In yet a further embodiment of the present invention the computing device 2200 is a camera phone or a wireless phone with a camera. In yet a further embodiment of the present invention the computing device 2200 is a camera. - The detachable display has several advantages. One advantage is that one can hold the camera in a position, for instance above one's head, where it would be almost impossible to conveniently view its display. By detaching the display, one can hold the camera above one's head and still view the scene to be recorded on the detached display. In yet a further embodiment of the present invention, the
display 2206 has a local control 2210 that activates the camera part of the computing device 2200. This also allows a person to take an image of himself/herself. - In a further embodiment of the present invention a
computing device 2300 as illustrated in FIG. 23 has at least two lens/sensor units. The display 2206 in FIG. 22 is illustrated as viewed when it is folded open while attached. When folded closed, a screen 2211 is visible. One may also modify this embodiment by having a screen visible when the display is folded open. - However, when taking an image without detaching the display but with the display folded open, the screen in one embodiment is not visible to the user. One may address this by using a ball-type hinge that allows the display to be rotated so the screen becomes visible to the user. In a further embodiment a hinge part 2112 may be provided to the display that allows the display to be hinged on the computing device so it faces the user. One may also provide an additional display screen on the computing device that is not detachable, so that the computing device has a detachable display and a fixed screen.
- In a further embodiment of the present invention, the detachable display has a screen at both of its sides, as is illustrated in
FIG. 24 with a screen 2401. In yet a further embodiment a computing device has an integrated screen that cannot be removed and a detachable display that has at least one screen. The detachable display in a further embodiment has two screens, one at each large flat side. - In general, a computing device such as a PDA, a cell phone, a smart phone, a camera and the like has its own display screen. Such a display is generally an integrated display that is not detachable from the computing device. With the increased miniaturization of computing devices there is a trend to minimize the size of a computing device and consequently the size of the display screen that is part of that device. On the other hand, cell phones and smart phones and the like are starting to become the main mobile device, including the main camera, for consumers. One or more aspects of the present invention allow the mobile phone to become the primary computing device for a consumer.
- There is a desire to view images and other data on a larger screen. This creates a conflict with the desire for ultra-small and ultra-portable devices. One solution to that issue is to provide a small portable computing device, which may include at least one lens/sensor unit, with a very portable, thin and lightweight detached display with a large display screen. Such a display in accordance with an aspect of the present invention is separate, portable and self-contained. The computing device is able to recognize the display or, vice versa, the display is able to at least receive and use signals from the computing device. In one embodiment of the present invention, signals to the display from the computing device are provided via a wired connection. In a further embodiment of the present invention signals from the computing device are received by the display via wireless communications, for instance via radio signals such as a Bluetooth or a wireless USB protocol, or via any other wireless communication that allows the display screen to show images generated by the computing device in real-time and not merely as files.
-
FIG. 25 illustrates such a display and the system it may be part of. The display 2501 has a housing 2502 and at least one screen 2504. The display with all components and housing is preferably ultra-thin and light and does not consume much energy, preferably a minimal amount of energy that can be provided by an inductive source received by the display to power the screen, or by an ultra-thin battery that is not thicker than 1 mm and preferably thinner than 1 mm. Preferably the inductive power receiver or ultra-thin battery 2506 (which may not have been drawn to scale) is integrated in the housing 2502. The display is provided with a communication device 2507 that communicates with a corresponding communication device 2508 in a portable computing device 2503 that includes at least one lens/sensor unit 2509. In one embodiment the computing device 2503 is a mobile phone or smart phone. In another embodiment the computing device 2503 is a stand-alone camera, preferably a digital camera. - In one embodiment of the present invention a portable and mobile computing device such as 2503 and a portable and mobile display such as a ut/
ul display 2501, even though separate, form a system, wherein the computing device generates at least one signal that is strictly designated to be processed by the display and displayed by a screen of the display. The display is then uniquely configured and/or authorized to correctly process such a signal to create an appropriate image. A configuration or authorization can be achieved by encoding or encrypting the signal in the computing device and providing only an authorized display with the corresponding decryption or descrambling circuitry or program. In a further embodiment of the present invention the signal is modulated in a unique way, for instance by frequency or time-slot hopping, which can only be demodulated in an authorized way by the display or its embedded receiver. The above applies more specifically to a wireless connection between a computing device and a display. A unique and authorized connection can also be achieved by using a single and exclusive wired connection between the computing device and the display wherein the computing device is connected only with the display. The above embodiment clearly excludes any broadcasting application for connection to a plurality of displays. - In a further embodiment of the present invention a computing device connects in real-time with a plurality of displays, which may be ut/ul displays, to provide a real-time image generated by the computing device. In yet a further embodiment of the present invention such a real-time image is a panoramic image or a stereoscopic image generated in accordance with one or more aspects of the present invention.
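As a rough illustration of the encoding-based authorization described above, the sketch below is not part of the original disclosure; the names and the HMAC-based keystream construction are assumptions. It scrambles each image frame with a keystream derived from a shared secret, so that only a display provisioned with the same key can descramble the signal:

```python
import hmac
import hashlib

def _keystream(key: bytes, frame_no: int, length: int) -> bytes:
    """Derive a per-frame keystream from the shared key (HMAC in counter mode)."""
    out = b""
    counter = 0
    while len(out) < length:
        msg = frame_no.to_bytes(8, "big") + counter.to_bytes(8, "big")
        out += hmac.new(key, msg, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def scramble(key: bytes, frame_no: int, pixels: bytes) -> bytes:
    """Computing-device side: encode a frame so only a display holding `key` can show it."""
    ks = _keystream(key, frame_no, len(pixels))
    return bytes(p ^ k for p, k in zip(pixels, ks))

# XOR is symmetric, so the authorized display applies the same operation.
descramble = scramble
```

A display without the shared key recovers only noise, which matches the stated exclusion of broadcast-style connections to arbitrary displays.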
- In one embodiment of the present invention a system is provided that contains at least one camera, such as a
device 2503 and a plurality of displays 2501 which are configured and authorized to process a signal generated by the camera to display the image. In a further embodiment of the present invention each display contains a processor that is enabled to be programmed to be authorized to process the image signal generated by the camera. In yet a further embodiment of the present invention a coding processor generates a code or a program that is applied to a processor in a display so it becomes authorized to process the image signal generated by the camera. In yet a further embodiment of the present invention the image signal generated by the camera is distributed to a display by a network, which may be the Internet or a mobile telephone network. - A stand-alone camera in this case is defined as a portable device that is strictly limited to taking or recording images, which may be still images in one embodiment and video images in another embodiment of the present invention. A digital camera is a camera with at least one lens/image sensor combination. An example of a stand-alone digital camera is a DSLR (digital single-lens reflex) camera or any other digital camera for consumer or professional use. A stand-alone camera is not part of any other computing device such as a smart phone or a mobile phone. In one embodiment of the present invention a stand-alone camera has real-time communication means such as a wireless WiFi connection, Bluetooth, or real-time wired or wireless USB. This communication means in the case of the stand-alone camera is mainly used to receive instructions and/or image data and to transmit status data and/or image data. In another embodiment of the present invention the communication means is exclusively used to receive instructions and/or image data and to transmit status data and/or image data. In a further embodiment of the present invention a stand-alone camera is also used to record sound related to a scene.
In that case the communication means is also used to transmit and receive instructions and data related to an audio signal. A stand-alone camera in a further embodiment of the present invention has processing capabilities, for instance in image processing or other tasks to generate image data that represents a desirable image that may be provided with an audio signal. It is believed that a stand-alone camera that is connectable to a display as disclosed herein is novel. In a further embodiment of the present invention a stand-alone camera is connected to an ultra-thin, ultra-light display via a wired connection. In yet a further embodiment of the present invention the wired connection also provides power to the ultra-thin, ultra-light display.
- In one embodiment of the present invention a system is provided containing a stand-alone camera with a processor and memory and a connection to a ut/ul display to display in real-time an image being recorded by a lens/sensor unit in the stand-alone camera. In a further embodiment the stand-alone camera contains two lens/sensor units. The stand-alone camera and the ut/ul display as provided herein are in one embodiment connected via a wire and in a further embodiment via a wireless connection. Despite having a wide-angle lens or at least two lens/sensor units, a camera may not cover a desired field of view.
- In one embodiment of the present invention a
computing device stand 3900 as illustrated in FIG. 37 is provided with a platform 3901 enabled to hold the computing device 3902 containing at least one lens/sensor unit, the computing device having at least a wired or wireless communication device to communicate with a display 4000, illustrated in FIG. 38 with a screen 4001, which may be a ut/ul display. In case of a wireless communication connection the computing device 3902 has an antenna 3903 and the display has an antenna 4003. The stand 3900 further has a rotatable or movable platform 3905. The platform 3905 has at least one remotely controlled actuator, such as an electrical motor, which may be a stepping motor or any other remotely controllable motor or actuator, which enables the platform 3901 to be rotated in a remotely controlled fashion. The actuator is either bi-directional based on a control signal, allowing it to move in a direction and a reverse direction, or comprises at least two actuators, one working in a direction and at least one other working in the reverse direction. - In one embodiment the platform itself has a receiver to receive a control signal for an actuator in the platform, for instance from the computing device or from the display. In yet a further embodiment of the present invention a rotating control signal is provided via the computing device or
camera 3902 to a motor or actuator controller in the platform. For instance, the platform, the computing device and the display are all Bluetooth enabled. - In one embodiment of the present invention the platform, for
instance a device holder 4102, has a connector 4108 to make a connection with the device 4101 as illustrated in FIG. 39, wherein an actuator controller in the platform receives a control signal via 4101. Thus, by applying a remote controller, a device holder attached to a platform, which in one embodiment holds a computing device with a lens/sensor unit, is instructed from a computer display to follow an object. In one embodiment of the present invention a computing device on a platform moves substantially in one plane. In a further embodiment of the present invention a computing device on a remotely controlled platform is also enabled to move in a vertical plane, for instance by using a vertical actuator such as a telescoping actuator 3906, or any other actuator that can change the orientation of the platform. - The use of a specific connector such as 4108 in one embodiment of the present invention requires that a device and a device holder on the platform have matching or corresponding features, and allows one to create a platform, or at least a device holder of the platform, that is proprietary to a certain device.
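The two actuator arrangements described above (a single bi-directional actuator versus a pair of single-direction actuators) admit a simple control mapping. The sketch below is purely illustrative; the function names and message fields are hypothetical and not taken from the disclosure:

```python
def drive_bidirectional(speed: float) -> dict:
    """Single bi-directional actuator: the sign of the control value selects
    the direction, its magnitude the speed."""
    return {"direction": "reverse" if speed < 0 else "forward", "speed": abs(speed)}

def drive_paired(speed: float) -> dict:
    """Two single-direction actuators: only one is energized at a time."""
    return {"forward_actuator": max(speed, 0.0), "reverse_actuator": max(-speed, 0.0)}
```

Either mapping lets one signed command value from the display drive the platform regardless of which actuator arrangement it contains.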
- In one embodiment of the present invention a
remote controller 4002, which may be a joystick, is incorporated in the display. In a further embodiment of the present invention one or more sensors such as accelerometers 4004 are attached to the display or are incorporated in the display. Such a display with accelerometers is disclosed in U.S. Pat. No. 7,688,306 to Wehrenberg et al. issued on Mar. 30, 2010, which is incorporated herein by reference. The purpose of the sensors is to determine a position or a change of position of the display. Based on the detected change in position, a signal which is associated with the difference in position of the display is generated and is used to control or drive a motor or an actuator in the platform. Positional sensors and positional difference sensors are known and include accelerometers, magnetic sensors, MEMS with mechanical components, gyroscopes such as optical fiber gyroscopes, Hall sensors, inertial sensors, vibrational sensors and any other sensor that can be applied to determine a positional or angular difference, angular rate change, angular velocity, linear rate change or linear velocity change of the display. These sensors will be called positional sensors herein. Many of these positional sensors are available in very small formats, for instance integrated on one or more chips, which can easily be fitted inside a display, even a ut/ul display. A change in position or angle of the display causes the actuator in the platform to be activated. - In one embodiment of the present invention, the actuator in the platform is activated based on the amount of change in the position of the display from a neutral position. To stop the actuator from being activated the display has to be returned to its neutral position. A larger change in position of the display from its neutral position can create a faster activity by the actuator. In a further embodiment of the present invention, activity of the actuator depends on the rate of change of the position of the display.
Because the sensors and the actuator are directionally sensitive, in a further embodiment of the present invention a control element such as a button is provided on the display, which allows the actuator to respond or to stop responding to a change of position of the display. This allows a user to return the display to a neutral position without activating the actuator.
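The neutral-position behavior described above can be sketched as a small control function. This is an illustrative assumption, not the patent's implementation; the deadband, gain and parameter names are invented for the example:

```python
def actuator_command(angle_deg: float, neutral_deg: float, enabled: bool,
                     deadband_deg: float = 2.0, gain: float = 0.1) -> float:
    """Map the display's angular offset from its neutral position to an
    actuator speed. The command is zero inside the deadband (display at or
    near neutral) or while the control button has disabled tracking;
    otherwise a larger offset produces a faster actuator."""
    if not enabled:
        return 0.0
    offset = angle_deg - neutral_deg
    if abs(offset) <= deadband_deg:
        return 0.0
    return gain * offset
```

The direction-reversal control described in the following paragraph could be modeled by negating the returned value.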
- In a further embodiment of the present invention the display has a control element, such as a button, to reverse the moving direction in a substantially horizontal plane based on a movement of the display. When a display is moved from left to right the platform also moves from left to right. This is useful when a user of the display and the camera in the device holder face in the same direction. By activating a control on the display a user can reverse the direction of the platform, so that when the display moves from left to right the camera, seen from the direction a lens is facing, moves from right to left. This is useful when the camera and the user are facing each other. Activating the control again reverses the direction of the actuator as a result of a movement of the display.
- A user confirms a certain position as a zero position, which corresponds with a zero position of the computing device. By moving the display with the positional sensors, the user can cause movement of the platform, and thus of the computing device with a camera, and track an object or pan an area. In one embodiment of the present invention a fixed tracking plane for the platform can be set or programmed. In a further embodiment of the present invention the platform can be provided with a third movement, for instance in a vertical plane.
- The display can be one of a ut/ul display, a portable computer display, a tablet PC, a mobile phone, a smart phone, a PDA (personal digital assistant) or any other device including a display screen, at least one accelerometer, and a communication device to connect to a remote camera and a remote motor controller which is preferably a wireless connection but can also be a wired connection. One advantage of the above embodiments is that a user can track an object via a display rather than having to look through a viewer. In yet a further embodiment of the present invention a display has a processor that is programmed to instruct a motor controller to follow a pre-programmed track. In yet a further embodiment a tracking path of the platform is recorded and can be repeated by the display or the computing system.
- Cameras nowadays can be very small and are used in for instance computer gaming. The remotely controlled
platform 4100 for a computing device, for instance including a small camera, can also be small, as is illustrated in FIG. 39. For instance, a movable platform has a bracket or fixture 4105 that can be fixed, for instance temporarily, against or on an object or a wall. Fixed on the bracket or structure is a motor or actuator holder 4104 which contains the motor or actuator controller and the motor or actuator that can move an axis or arm 4103 in a first plane, which can be a horizontal plane. Also contained in 4100 is a receiver to receive actuator control instructions from the display. A camera/computing device holder 4102 attached to the arm is enabled to receive and tightly hold a computing device 4101. In a further embodiment of the present invention a motor or actuator 4106 is attached between 4102 and 4103, enabling the holder 4102 to be rotated in a different plane, for instance perpendicular to the first plane.
holder 4102 to be rotated in a plane perpendicular to the horizontal plane in which arm 4103 can move and perpendicular to arm 4103. This in effect allows a rotation of the field of view of the camera in computing device 4101 and allows for a remote correction of the horizon in the image taken by the camera of 4101 if the orientation of the platform is not aligned with an orientation of an object.
FIG. 40 illustrates the platform viewed from above. The actuators in one embodiment of the present invention are provided with electrical power. A power source, such as a battery, in one embodiment of the present invention is included in the platform. In a further embodiment of the present invention, a battery in the computing device powers the platform, for instance via a power connection. In yet a further embodiment, power is provided from an external source, for instance via a power cord. - In one embodiment of the present invention a
platform 4100 is fixed against a wall, a structure or an object with a removable adhesive such as a Scotch® Wallsaver™ Removable Mounting Square from 3M Corporation. Other attachment means such as magnets, Velcro® fasteners, pins, screws, straps, clips or any other fastening mechanism that allows the platform to be fixed to an object or structure are fully contemplated. This allows a computing device with a camera to be positioned, preferably in a removable manner, in a non-intrusive location, for instance during a performance, and allows a user to pan, track, focus and zoom on part of a scene or a performance. In one embodiment of the present invention a stand such as a portable camera stand is provided. In one embodiment of the present invention a camera remote control is provided, which may be incorporated in the display. A touch-type screen in the display in one embodiment of the present invention is enabled to display a camera control to control one of camera focus, camera zoom in/out, camera aperture, camera on/off and any other camera control. - In one embodiment of the present invention, after a platform has been positioned and camera settings have been provided, a confirmation signal from the display keeps the camera platform and/or the camera in its preferred settings.
- In one embodiment of the present invention the camera platform and the camera in the computing device and the display collaborate to track an object or a person in a scene. This is illustrated in
FIG. 41. A screen 4300 contains an image of a person 4302 recorded by the camera in the computing device on the platform. Preferably, a user provides an identifier 4301, such as an ellipse, around the image on a screen of the display, and a processor in the computing device or the display is trained to learn characteristics of the object. This may be segmentation learning, statistical distribution learning, learning of specific characteristics such as head and legs, or detection of a provided characteristic such as a colored tape or patch on the object, or any other object image learning that allows an object to be detected in an image. After analysis or learning, in an operational mode the processor is enabled to recognize or track the object or person in a scene. The processor in one embodiment determines a tracking center 4304 of the object. A user may provide a tracking box 4303 on the screen which determines a part of the screen wherein the object's or person's image has to appear. For instance, one may want to have the object appear at the right side of the screen. The size of the box 4303 determines the activity of the platform. If a person moves inside an area corresponding to 4303 the platform will not move, creating a relatively stable screen image. However, when the person or object moves outside the area corresponding to the box, the platform will move. The location of, for instance, the tracking center 4304 determines the control signals provided via the processor to the motor controls of the platform. Image tracking is well known and is disclosed for instance in U.S. patent application Ser. No. 11/284,496 to Zhang et al. filed on Nov. 21, 2005, which is incorporated herein by reference. - A processor in one embodiment of the present invention is provided that is programmed to track a moving object from an image generated by a lens/sensor unit on the mobile platform.
Based on the location of a tracking point in the image of the object a processor controls a motor or actuator in the mobile platform that allows a lens/sensor unit in a device held on the platform to keep the object in its field of vision.
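The tracking-box behavior described above can be sketched as follows. This is a hedged illustration, not the disclosed implementation: the coordinate conventions and names are assumed, and a real controller would add smoothing and rate limits:

```python
def platform_command(center, box):
    """Return (pan, tilt) drive terms that push the tracking center (x, y)
    back toward the tracking box (left, top, right, bottom), in pixels.
    While the center stays inside the box the platform does not move,
    which keeps the screen image relatively stable."""
    x, y = center
    left, top, right, bottom = box
    pan = (x - right) if x > right else (x - left) if x < left else 0
    tilt = (y - bottom) if y > bottom else (y - top) if y < top else 0
    return pan, tilt
```

A larger box therefore tolerates more object motion before the platform is activated, matching the statement that the size of the box determines the activity of the platform.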
- In one embodiment of the present invention the mobile platform is compact and light. For instance, a holder of the mobile platform is about the size of a mobile phone, but made of light strips or a web of rigid or semi-rigid material. Combined with the motor(s), actuator(s), control and communication circuitry, arm and bracket, the weight is preferably less than 500 grams, more preferably less than 250 grams and most preferably less than 200 grams. The mobile platform should be mobile and portable. It should be able to fit in a box with a volume preferably less than 1000 cm3, more preferably about or less than 500 cm3 and most preferably about or less than 250 cm3.
- Returning to
FIG. 25, in yet another embodiment of the present invention the computing device 2503 has no lens/sensor unit and is a router or a router-like device. The display 2501 has at least one control 2505 to control the display screen, including but not limited to an on/off switch, a contrast control, a brightness control and the like. In a further embodiment the control 2505 is a touch screen that is either separate from or part of screen 2504. The display 2501 also contains control logic and memory 2512. The device 2512 may also contain a processor for processing image data that is to be displayed. In one embodiment device 2507 receives a signal that contains image data. Device 2507, which may be a wireless device such as a Bluetooth device or a WiFi device, or a device with a wired connection such as a USB port, can demodulate the received signal, for instance from 2508, into a base-band signal that represents a displayable image that is provided to 2512, where it may be stored, if necessary only temporarily, to be processed and to be displayed on the screen 2504. In one embodiment a demodulator demodulates the signal and an extractor extracts the image data from the demodulated signal. In a further embodiment the image data is arranged in a frame structure that allows the image data to be displayed on the display screen. In a further embodiment 2512 may contain a mass storage chip to store a series of images, such as consecutive video images. - In a further embodiment of the
present invention, display 2501 also contains a power inlet 2511 that allows it to be connected to a power outlet, for instance to a power outlet 2514 on device 2503, to power the display 2501. In a further embodiment of the present invention the power connection between 2511 and 2514 includes a communication connection and may be a USB connection or a WiFi connection. This allows the display to be powered by 2503 while also taking care of the communication that otherwise would be taken care of by 2507 and 2508. - In one embodiment of the present invention the
computing device 2503 is a very small device that fits easily in a clenched fist of a mature person. For instance, such a device may have measurements of about 8*3*2 cm3 in volume, such as the Samsung® SCH-U470 model mobile phone, or smaller. In yet another embodiment it is larger than 8*3*2 cm3 but smaller than 10*3*3 cm3 in volume. In yet another embodiment it is smaller than 10*4*4 cm3 in volume. The device has a screen for display of about 3*2 cm, or larger or smaller, which is adequate for viewing information, but of course provides only a limited-quality view of images. The advantage of such a small device, or an even smaller one, is that it is extremely portable. Even smaller devices can be made with imaging, processing, communication, audio and storage capabilities or any combination thereof. The advantage of such a small or even smaller device with at least one lens/sensor unit is that a user can hold it in virtually any position to take an image. Unfortunately, a user can only review an image on the device after it has been recorded on the device. A display 2501 that shows images taken by 2509 of 2503 in real time allows the display 2501 to be used as a real-time viewer. In one embodiment at least one lens/sensor unit 2509 on a device 2503 records an image, which may be a video image, and provides the related image data to a communication device 2508 which transmits it to a receiving device 2507 of display 2501 to be processed by 2512 and displayed in real-time or close to real-time on screen 2504. In a further embodiment device 2503 has at least 2 lens/sensor units from which device 2503 creates a 3D or a panoramic image which is provided for transmission on 2508. The screen 2504 may be a 3D-enabled screen to display 3D images. - In one embodiment of the present invention a computing device is connected to a single display wherein a
screen 2504 of the connectable ultrathin, ultralight display is at least 1.5 times the size in area, and preferably at least 2 times the size in area, of a display screen that is an integrated part of the computing device. In yet a further embodiment of the present invention the connectable ultralight ultrathin (ut/ul) display is preferably at least 5 times, more preferably at least 10 times, yet more preferably at least 20 times, yet more preferably at least 40 times and most preferably at least 100 times the size of the integrated display of the computing device. - In one embodiment the
screen 2504 works as a real-time viewer of lens/sensor unit 2509. In general a camera unit creates and stores an image. Such an image can be transferred to another device in the form of a file, including a RAW file. However, in at least the one embodiment of the present invention of this paragraph, the screen 2504 shows the same image, which may be a video image, as one would see on a viewer on the device. One should keep in mind that a camera viewer on the computing device in one embodiment is significantly smaller than the screen 2504. This may mean that screen 2504 can show an image in a higher resolution than the integrated viewer on the computing device. This issue can be addressed in several ways. In one embodiment of the present invention the computing device recognizes the device 2501 and the display screen requirements. This may include the type of display screen and the signal formats required to display an image correctly on the screen 2504. For instance a screen 2504 can show four times as many pixel lines as the integrated viewer. This recognition may be achieved for instance by an identifying code that is transmitted by device 2501 during initiation of a connection and that is recognized by 2503. Device 2503 may have a memory that stores different display protocols for a possible connected screen 2504. When a specific identifying code is recognized an image processing application is activated that processes image data generated by 2509 into the format required by 2504. In one embodiment that processed data is transmitted as a signal to the display 2501 for display on 2504. In one embodiment the signal is provided to a driver of the screen 2504 to generate the image on the screen. In a further embodiment additional signal processing, for instance amplification, is required to condition the signal for the driver. 
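The recognition-and-format step described above can be sketched as follows. The profile names, codes, resolutions and pixel formats in this sketch are illustrative assumptions, not part of the disclosure; it shows only the idea of selecting a stored display protocol from an identifying code and reshaping image data for the connected screen 2504.

```python
# Illustrative sketch: device 2503 receives an identifying code from
# display 2501 during connection setup and selects a stored display
# protocol for it. All codes and resolutions here are assumed values.

# Stored display protocols in the memory of the computing device (2503).
DISPLAY_PROFILES = {
    "UTUL-A4": {"width": 1024, "height": 768, "pixel_format": "RGB565"},
    "UTUL-POSTCARD": {"width": 480, "height": 320, "pixel_format": "RGB565"},
}

def negotiate_display(identifying_code):
    """Return the display profile for a recognized identifying code."""
    profile = DISPLAY_PROFILES.get(identifying_code)
    if profile is None:
        raise ValueError("unknown display: %s" % identifying_code)
    return profile

def format_frame(raw_lines, profile):
    """Crop or pad sensor data (2509) to the connected screen's line count."""
    lines = raw_lines[: profile["height"]]
    # Pad with blank lines if the sensor delivers fewer lines than the screen shows.
    while len(lines) < profile["height"]:
        lines.append([0] * profile["width"])
    return lines

profile = negotiate_display("UTUL-A4")
frame = format_frame([[0] * 1024 for _ in range(600)], profile)
```

In this sketch the formatted frame always matches the screen's line count, so the larger screen 2504 never receives fewer lines than its driver expects.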
In yet a further embodiment, especially when the signal is transmitted wirelessly, the signal is a modulated signal, such as an RF (radio frequency) or infra-red signal. A wired connection may also require a modulated signal. In the case of modulation the display 2501 has a receiver to receive the signal and condition it for demodulation; the receiver demodulates the signal, extracts the image data, processes it into the format required by the driver and provides the image data to the driver to drive the screen to show the image. Intermediate steps may be required to enhance a signal and to remove extraneous data that were used for correct communications, including but not limited to handshake data and error-correction data. - In a further embodiment the
device 2503 transmits image data to 2501 in a standard format such as JPEG or RAW. In that case the device 2501 has a processor that is programmed to transform the image data from its standard format into the format required by a screen 2504 and its related driver. While this is a possible embodiment of the present invention, it is a less preferred one, as it requires a small, low-power processor on the device 2501 that can achieve this. It is preferred to have most power-intensive processing done by the device 2503 and to limit power-intensive processing on device 2501. - In a further embodiment of the present invention, the
device 2503 stores at least one image or a video image. The device 2503 is enabled to play back stored images for review. In a further embodiment the device 2503 provides stored images on 2508 to be transmitted to 2507 for review on 2504. - In one embodiment of the present invention the display is a self-contained and portable display that works with a computing device that includes a camera with at least one sensor/lens unit. Users already carry many devices, which may include a cellphone, a laptop, an MP3 player, a digital camera and the like. To carry an additional device, such as a
display 2501, requires the display to provide desirable advantages while avoiding substantial disadvantages. It is believed that an embodiment of a connectable display with a sturdy and rigid housing that has a very small thickness, a large screen and a very low weight offers a substantial advantage that is currently not available and has also not been articulated as such. It is believed that an embodiment of the present invention that provides a separate camera with a smaller display screen and a separate connectable display with a large display screen offers a substantial advantage. Furthermore, the ability to view large format images is believed to be a very substantial advantage, for people with normal vision and for people with impaired vision alike. Furthermore, the ability to view large format images generated in real-time by a camera on a separate display screen is believed to be a substantial advantage. Furthermore, an embodiment of the present invention enabled to show images in full color is believed to be an advantage. Furthermore, an embodiment of the present invention enabled to show video or video-like images in real time on the display screen is believed to be an advantage. - One disadvantage of a display is its weight and bulk if not appropriately dimensioned. This is currently demonstrated by available eReaders and tablet-like computers such as the Apple iPAD®. First of all these devices do not have the advantages or aspects of the present invention. Secondly, they have bulky batteries. Even an advanced eReader such as the announced QUE proReader™ by Plastic Logic Ltd. of Cambridge, UK has some disadvantages, as is disclosed on the fact sheet released by Plastic Logic which is incorporated herein by reference. 
It displays still images, in grey shade colors, and has a thickness over 7 mm, with a weight over a pound. Seiko Epson reported an eReader in an 18 cm by 12 cm housing, 3 mm thick and weighing 57 grams, in 2007 at Embedded Technology 2007, an exhibition event in the Pacifico Yokohama convention center. The device has a USB port. However, the device displays grey-scale rather than video-enabled images and reportedly has a refresh time of 0.8 sec. It was also not disclosed how data exchange was implemented. It appears that it supports document-type transfer.
- While being portable, current devices that are on the market or that have been announced appear not to meet the requirements of being ultrathin, light and having a large display screen. A display screen that meets the requirements as an aspect of the current invention would be like a magazine or a paper notepad, for instance of A4 format or larger. Such a notepad is extremely portable and light in weight, noticeable only by its size, not by its weight or bulk, and is very transportable without being intrusive.
- In one embodiment of the present invention the computing device connects directly with the display. This means that the computing device generates and transmits a signal that is received and processed by the display. No intermediate signal processing between the two apparatuses is required in case of wireless transmission. In a wired transmission only a single wire between the computing device and the display is used for a direct connection. In a further embodiment of the present invention a direct connection is autonomous and no other devices than the computing device and the display in a wireless connection are required. In a direct autonomous wired connection only the computing device, the display and the wire are required. The wire may contain circuitry, such as amplification circuitry. However, the wire is also portable and mobile, so that the computing device is enabled to be connected to the display with the wire without any further supporting infrastructure.
- One question arises: why would someone carry yet another device? It is believed that one will be enticed to do so with a display that offers the advantages of aspects of the invention. It is believed that the display will be an attractive must-have device that is currently not available if it meets functional aspects of the present invention as well as one or more dimensional and weight requirements. It appears that the upper limit of weight for an external display to be carried along is about 1 pound, with a screen size of at least 10 by 15 cm, preferably 15 by 23 cm and more preferably 8 by 11 inches and yet more preferably greater than 8 by 11 inches. The upper weight limit for all the preferred screen sizes is about 1 pound, more preferably 12 ounces and even more preferably 8 ounces or less. Even more preferable is a weight of 6 ounces.
- For an external display a weight of 16 ounces or less is preferable, a weight of 12 ounces or less is more preferable and a weight of 6-8 ounces is most preferable. A weight substantially less than 6-8 ounces and certainly less than 6 ounces is also preferable. A low weight is preferable for a portable display, but a very light weight display weighing less than 6 ounces appears to lose a sense of substance. A minimum screen size of substantially or close to 10 by 15 cm appears to be a minimum requirement for carrying an extra device. An even smaller size, such as a half-postcard-sized display screen of about 7 cm by 4.5 cm, is desirable and even more preferable if it is thinner than 5 mm and light, for instance 12 ounces or lighter or even more preferably 8 ounces or lighter. A larger size with low weight is also preferable. A minimum screen size of substantially or about 20 cm by 15 cm for a portable display is more preferable, while a screen size of a notebook or writing pad size (standard 8 by 11.5 inches or A4 in Europe) appears to be very preferable. Larger sizes at low weight are also preferred, but less so for a portable display as it starts to become too large to be carried in carry-on luggage. A large screen as an aspect of the present invention is fully color enabled. In a further embodiment of the present invention it is also video enabled with a display rate of at least 12 fps.
- Sizes as provided herein are sizes equivalent to an area that is defined by the area of a rectangle with one side L and one side W. The area is then L*W. So if a size is preferably larger than 15 cm by 10 cm, it means that a size is preferably larger than 150 cm2. In one embodiment one may prefer a rectangular shape. In another embodiment one may prefer a square shape. In yet another embodiment one may prefer an elliptical shape.
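The size convention above can be stated as a small calculation: a quoted size "L by W" denotes the rectangle area L*W, and any shape qualifies by comparing its area against that number. A minimal sketch (the example semi-axis values are illustrative):

```python
# Size equivalence as defined above: "larger than L by W" means an
# area larger than L*W, regardless of the actual screen shape.
import math

def rect_area_cm2(l_cm, w_cm):
    """Equivalent area of an 'L by W' size specification."""
    return l_cm * w_cm

def ellipse_area_cm2(a_cm, b_cm):
    """Area of an elliptical screen with semi-axes a and b."""
    return math.pi * a_cm * b_cm

# "Preferably larger than 15 cm by 10 cm" means an area above 150 cm2;
# an elliptical screen with semi-axes 8 cm and 6 cm (~150.8 cm2) qualifies.
threshold = rect_area_cm2(15, 10)
elliptical = ellipse_area_cm2(8, 6)
```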
- Furthermore, it is a requirement to provide in one embodiment a portable and connectable display that is not bulky. It appears that a display with a thickness greater than 1 cm is not very desirable. In a first embodiment of the present invention a display has a thickness of 1 cm or less. More preferable is a display that has a thickness of 7.5 mm or less. Even more preferable is a display that has a thickness of 7 mm or less. Even more preferable is a display that has a thickness of 6 mm or less. Even more preferable is a display that has a thickness of 5 mm or less. Even more preferable is a display that has a thickness of 4 mm or less. Even more preferable is a display that has a thickness of 3 mm or less. A thickness of 2 mm or less is also preferred. However, there is also a requirement of rigidity and sturdiness. A thickness of 2 mm or less is only preferred if the housing and the whole display maintain a feel of sturdiness, stiffness and rigidity. There is also the cost of creating the display to be considered.
- Computing devices with display screens of course exist. These computing devices have their display screen integrated into one body. As an embodiment of the present invention a display is a separate but connectable ultra-thin/ultra-light display (UT/UL or ut/ul display). A ut/ul display is a portable external accessory to a computing device, like earphones. Current devices and displays are not ut/ul displays. First of all their displays are not separate from the computing devices. The Apple iPAD® for instance has a size of 242 by 189 by 13.4 mm with a weight of about 700 grams and clearly does not meet the requirements of a ut/ul display. The Apple iPOD® for instance has a size of about 110 by 62 by 8.5 mm with a weight of about 115 grams, wherein the actual screen is substantially smaller than the size of the device, and also does not meet at least the thickness requirements of a ut/ul display.
- By using materials, including housing materials, screen material and the like, with in one embodiment an average density of 1500 kg/m3, one can achieve a ut/ul display with a housing size of 30 cm by 25 cm, with a display screen of 25 cm by 20 cm, with a weight of 113 grams per 1 mm of thickness. A thickness of at most 4.5 mm in a box-shaped display will still meet the requirements of a ut/ul display. This assumes a solid housing, which is not likely. More likely the housing has walls 1 mm thick, preferably 0.5 mm thick, for instance of a fiber-reinforced composite material. Furthermore, one may fill at least part of a housing formed by thin shells of about 1 mm thick or less with a foam such as a structural foam. One may for instance fill half shells with a foam that has a density significantly lower than 1500 kg/m3, remove or laser-ablate the foam for the shapes of the components such as connectors, antennas, screens, circuits and the like, assemble the components into the shells and glue or fix the shells together. This enables a display with an average density of 900 kg/m3, and more likely an average even lower than 900 kg/m3, at a large size. This enables a display of 300 mm by 250 mm by 5 mm with a screen of 25 cm by 20 cm to have a weight of less than 12 ounces.
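The weight figures in the paragraph above follow directly from weight = length × width × thickness × average density, and can be verified with a short calculation:

```python
# Verifying the weight figures above for a box-shaped display modeled
# as a uniform slab of the stated average density.
GRAMS_PER_OUNCE = 28.3495

def slab_weight_g(l_m, w_m, t_m, density_kg_m3):
    """Weight in grams of a slab of the given dimensions and density."""
    return l_m * w_m * t_m * density_kg_m3 * 1000.0

# 30 cm x 25 cm housing at an average density of 1500 kg/m3:
# about 112.5 g per mm of thickness (the "113 grams per 1 mm" above).
per_mm = slab_weight_g(0.30, 0.25, 0.001, 1500)

# Foam-filled construction at an average density of 900 kg/m3,
# 300 mm x 250 mm x 5 mm: 337.5 g, just under 12 ounces (~340.2 g).
foam_display = slab_weight_g(0.30, 0.25, 0.005, 900)
```

This confirms that the foam-filled 5 mm construction at 900 kg/m3 comes in under the 12-ounce target, while a solid 1500 kg/m3 slab would need to be thinner.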
- Part of the display is formed by the screen and screen backplane, the housing of the display and the battery or power source (if used). The screen itself can be thin if an OLED or an electrophoretic display screen is used. The thickness of ePaper or eInk display screens with backplane has been reported to be in a range of 250 micron to 850 micron. Accordingly, display screens of large areas complying with the requirements herein are available with a thickness of about 500 micron (or micrometer). Full color and video imaging is enabled by an electrophoretic screen using electrowetting. This is disclosed in for instance US Patent Publication Ser. No. 20100128015 to Feenstra et al. published on May 27, 2010, which is incorporated herein by reference in its entirety. Such screens have been demonstrated by Liquavista of Eindhoven in the Netherlands.
- The body of the display should be strong but also ultra thin. One way to achieve this is to use high-strength and high stiffness polymers. For instance one may use a polycarbonate material such as disclosed in U.S. Pat. No. 6,043,310 to Liu et al. issued on Mar. 28, 2000 which is incorporated herein by reference. One may use a housing with a maximum thickness of 5 mm or less. In one embodiment one may use a housing with a maximum thickness of about 3 mm or less. In yet a further embodiment one may use a housing of a maximum thickness of about 2 mm or less. In one embodiment one may use high strength polymer composite materials that have been reinforced by fiber materials. These composite materials are well known. In one embodiment carbon fiber reinforced composites are used. For instance in one embodiment a carbon reinforced polyimide resin is used to form a thin but strong housing. In another embodiment the housing is made of carbon nanotubes reinforced composites. Accordingly, a stiff housing for a display can be made with a thickness of less than 2 mm thick; of less than 3 mm thick; of less than 4 mm thick; and of less than 5 mm thick.
- A review of documents does not reveal anyone disclosing or pursuing limitations as described herein. It appears that a thickness of about 1 cm is pursued for a large display with housing, with some finding a thickness over 7 mm ultrathin. Seiko showed a non-video-enabled, non-color ultra-thin device that is ultra-light but that is smaller than 25 cm by 20 cm. A weight of over 1 pound appears to be considered acceptable. These embodiments are embodiments of eReaders and computing tablets and not of ultrathin, ultralight and large connectable displays as described and enabled herein.
- There are many developments in display screen technology of ultra-light, ultra-thin and even flexible black/white, grey scale and/or color image display screen capabilities. In one embodiment these display screens have a refresh rate of about 1 sec down to 0.1 sec per frame to full video capabilities of 30 or more frames per second. One of ordinary skill will be familiar with these technologies. Some are still expensive, but are fully enabled to implement the screens of a display required as an aspect of the present invention. Two familiar display screen technologies are the OLED technology and electrophoretic display screen technology also known as electronic paper. In general such a screen is an electro-optic display screen. It may contain a material with for instance active elements or with Microencapsulated Electrophoretic Capsules. A field across the display screen activates the display screen elements in one embodiment. Such a display screen is disclosed in for instance US Patent Publication 20070247697 to Sohn et al published on Oct. 25, 2007 now U.S. Pat. No. 7,672,040 to Sohn et al issued on Mar. 2, 2010 which are both incorporated herein in their entirety.
- In accordance with one embodiment of the present invention the supporting electronic devices for communication, power management and screen display management are embedded in electronic integrated circuits. These integrated circuits are embedded in the plastic housing of the ut/ul display and provided with the required connections. How to embed integrated circuits into the plastic housing is known and is for instance disclosed in the article “Embedding of thinned chips in plastic substrates” by Vandecasteele et al. of IMEC-TFCG Microsystems in Zwijnaarde, Belgium as downloaded from the Internet and which is incorporated herein by reference. One may create a stiff and very thin housing of a double shell of for instance a total thickness of 3 mm and a shell wall thickness of 1 mm or less in a first embodiment and with a shell wall thickness of 0.5 mm or less in a second embodiment. In a further embodiment the total shell thickness is about 2 mm or less. If open room exists between the shell walls the shell can be filled with ultra light structural foam which provides additional strength while keeping the total weight low. The components such as chips and connectors can be fixed in or to the shell or the foam in different ways. In one embodiment foam or shell are partially ablated for instance by a laser such as a pulsed laser to create a shape in the shell or the foam that will receive the relevant components, which can then be placed and fixed into place, for instance by a glue bonding.
-
FIG. 26 illustrates one embodiment of a ut/ul display 2600 both in front view 2600 a and a cross-sectional side view 2600 b. The display has a body or housing 2601 with a thin screen 2602 as was discussed above. Also included in an embodiment is at least one control element 2606 which may be an on/off switch. In a further embodiment the ut/ul display has a connector 2603 for connecting an external power source. In the cross-sectional view also at least one embedded IC or chip 2607 is shown. Chip 2607 in a further embodiment is representative of at least a power control chip, a display control chip (including a driver), and any other ICs that are required to enable the display. A device 2605, which is included in the housing in an embodiment, represents a communication device, including an antenna in a further embodiment, to receive and process signals from a computing device that transmits signals representing an image. These signals are processed into further signals that can be processed by 2607 to be provided to 2602 for display of an image, which can be a video image. In a further embodiment, image signals may be provided through a wired connection. Device 2605 in such an embodiment can be configured to connect to a wire to receive the required signal. A wire may be an electrical wire or an optical wire such as an optical fiber. In a further embodiment it is preferred to use only one wired connection. In that case 2603 is for instance a USB port enabled to transmit power as well as signals such as an image signal. - In general a battery causes much of the bulk and weight of a computing device. Preferably,
screen 2602 is a low power consumption screen as is known in the art. In that case an ultrathin battery 2604, which has a thickness of 1 mm or less and preferably of 0.5 mm, is applied. Very thin Li batteries using Lithium Phosphorus Oxynitride (LiPON) as electrolyte in a FET configuration are known. Ultra-thin batteries are disclosed in U.S. Pat. No. 7,389,580 to Jenson et al. issued on Jun. 28, 2008 which is incorporated herein by reference. In a further embodiment, power is provided by inductive transmission as discussed above and 2604 represents such an inductive power receiver. - The housing serves to hold the screen and all the supporting electronics and to provide stiffness to the display. It also serves to provide a user with the ability to hold the display at the edges. In a further embodiment of the ut/ul display the edges of the housing are tapered as shown in
FIG. 27 by 2701. This provides a user with a sense of thinness and saves in weight. - In one embodiment a portable display is provided that has a screen that is larger than a screen of the computing device it is connected to. In one embodiment the display has a thickness that is not greater than 5 mm. In a further embodiment it has a thickness that is not greater than 7 mm. In yet a further embodiment it has a thickness that is not greater than 1 cm. In yet a further embodiment it has a weight that is less than 12 ounces. Because the thickness and weight are not as restricted as in the above disclosed embodiments, one may use cheaper technology including batteries and connections and display screens and the like. It is currently possible to create very small and compact computing devices with camera capabilities, and with integrated storage, processing, rechargeable power and communication capabilities. Such devices can be fitted with a small display screen to provide minimal status information and with limited input controls. These devices can easily be about or smaller than 6 by 3 by 2 cm. However, in such a small size the use as a smart phone or computer is limited. Especially some form of convenient input and display device are missing. However, integrating a small computing device with a convenient keyboard and display screen will nullify its small footprint. In accordance with an aspect of the present invention a portable and connectable display is provided that has a screen size that is at least larger than the integrated display screen on the computing device. In a further embodiment a screen of such a display device is provided with touch screen capabilities, allowing it to be operated as an input device to a separate computing device.
- The computing device is enabled to work independently from an external display. It has its own input controls and a display screen. In accordance with an aspect of the present invention an external portable display is provided that can be connected to the computing device. In a further embodiment, the display has a screen that is at least partially or in its entirety a touch screen. The display is connectable to the computing device wirelessly, for instance by using a Bluetooth, WiFi or other wireless communication connection. In such a case the connectable display should be provided with its own power source, which may be a battery. In a further embodiment of the present invention, the display is connectable with the computing device by wire, for instance with a USB connection. In that case the display may have its own power source or it may be powered by a power source that is provided to the computing device. In one embodiment a connection is a real-time connection. This means that the display screen displays an interface, an image or a document as generated or as being processed by the computing device and not as a completed and stored document. Potential delays or interruptions in transmission of data between computing device and display can be addressed by applying a buffer at the display in one embodiment of the present invention, at the computing device in another embodiment of the present invention, and in both the computing device and the display in yet another embodiment of the present invention. This allows data that is to be displayed, but has not yet been displayed, to be stored temporarily.
- For instance the display may receive data from the computing device to be displayed on the screen, but the data does not establish a complete frame. A communication protocol, such as a handshake protocol, may inform the device that data is missing. The missing data will remain in a buffer on the computing device until the display confirms receipt of the data. One may also apply a buffer on the display that receives data at a slightly higher rate than is being processed. The display informs the computing device when the buffer is almost full, allowing the display to create a “buffer” of data for a maximum time of interruption. Data buffering techniques between a transmitting device and a receiving device are well known in the art.
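The display-side buffering described above can be sketched as a bounded frame buffer that signals the sender to slow down when nearly full. The capacity, high-water threshold and return-value convention here are illustrative assumptions, not the disclosed protocol:

```python
# Minimal sketch of display-side buffering: the display accepts frames
# into a bounded buffer and tells the computing device to pause when
# the buffer is nearly full, covering short transmission interruptions.
from collections import deque

class DisplayBuffer:
    def __init__(self, capacity=8, high_water=6):
        self.frames = deque()
        self.capacity = capacity      # hard limit of stored frames
        self.high_water = high_water  # "almost full" notification level

    def receive(self, frame):
        """Accept a frame; return True if the sender should keep sending."""
        if len(self.frames) >= self.capacity:
            return False  # buffer full: sender must retry this frame later
        self.frames.append(frame)
        # Ask the sender to slow down once the buffer is nearly full.
        return len(self.frames) < self.high_water

    def next_frame(self):
        """Pop the next frame for the screen driver, or None if empty."""
        return self.frames.popleft() if self.frames else None

buf = DisplayBuffer()
for i in range(7):
    buf.receive(i)
```

Because the screen driver drains frames from the front while the receiver appends at the back, short gaps in transmission do not interrupt display, matching the "maximum time of interruption" idea above.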
- The size of the screen of the connectable ut/ul display is at least larger in area than the screen of the computing device. Preferably the ut/ul screen is larger in area than 7 cm by 5 cm; more preferably the ut/ul screen is larger in area than 10 cm by 7 cm; even more preferably the ut/ul screen is larger in area than 15 cm by 10 cm; even more preferably the ut/ul screen is larger in area than 25 cm by 19 cm; even more preferably the ut/ul screen is larger in area than 30 cm by 20 cm; even more preferably the ut/ul screen is larger in area than 30 cm by 25 cm.
- It should be clear that the use of a large ut/ul screen completely changes the mode in which a computing device can be operated. More data and larger portions of images, text or documents can be shown in a usable way than on a small integrated screen of a small computing device as provided above. In a preferred embodiment most of the data processing, storage and imaging and phone-like circuitry and devices are located on the small footprint (8 cm by 3 cm by 2 cm in volume or smaller) or even smaller footprint (6 cm by 2 cm by 1.5 cm in volume or smaller) computing device. Such a device is portable and usable as a phone which may be wireless, an MP3 player, a camera (video, and/or panoramic and/or 3D), a video viewer, a data viewer, an audio and video recorder, data storage, a data recorder and in any other form for which a computing device is and can be configured. Without an external display, viewing of data, documents and images is not as convenient or useful as with a large display screen. Furthermore, computer applications such as word processing, spreadsheets, presentations, image processing and the like can be stored on a small footprint device and can be processed by a processor on the device but cannot be viewed and/or displayed adequately. The use of certain computer applications is very much enabled when an external ut/ul display is connected.
- One apparent advantage of using a ut/ul display together with a separate computing device with a lens/sensor unit is the ability to view an image during recording without having to view the scene through the image viewer on the computing device.
- In one embodiment of the present invention the weight of the computing device is also limited. A large integrated screen in the computing device will add to the weight. So will a large battery. In a preferred embodiment of the present invention the computing device has a weight that does not exceed 250 grams. In a more preferred embodiment of the present invention the computing device has a weight that does not exceed 200 grams. In yet a more preferred embodiment of the present invention the computing device has a weight that does not exceed 150 grams. In yet a more preferred embodiment of the present invention the computing device has a weight that does not exceed 100 grams. In one embodiment of the present invention the weight of the computing device is diminished by applying a thin battery, not exceeding 5 mm in thickness. In one embodiment of the present invention the weight of the computing device is diminished by applying a thin battery, not exceeding 3 mm in thickness.
- In one embodiment, a computing device recognizes when an external display is connected to it. Such a computing device may for instance use a simple, menu driven interface to a user on its own integrated screen. It is known that a computing device can be under control of an operating system. Such an operating system in one embodiment of the present invention is operational on a processor in a small footprint computing device. The operating system is programmed to recognize the connection of an external display device. Such recognition capability is for instance known for USB devices. When an external ut/ul display is recognized the operating system through the processor generates a graphical user interface (GUI) for the external display screen, while the integrated display screen on the computing device maintains its own interface. The GUI on the external display screen may be controlled by a control on the computing device. The external ut/ul display may also have its own user control, which may be a touch screen which communicates through the connection with the processor on the computing device.
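The recognition behavior described above can be sketched as follows. The class, method names and display identifier are illustrative assumptions standing in for operating-system hot-plug handling; they show only the idea that a recognized external display gets its own GUI while the integrated screen keeps its own interface:

```python
# Illustrative sketch of the operating-system behavior above: when an
# external ut/ul display identifies itself, the OS generates a GUI for
# it; the integrated screen keeps showing its own menu interface.

class ComputingDevice:
    def __init__(self):
        self.integrated_ui = "menu"   # simple menu-driven interface
        self.external_ui = None       # no external display connected yet

    def on_display_connected(self, display_id):
        """Called when a connected display transmits its identifying code."""
        if display_id.startswith("UTUL"):
            # Generate a full GUI for the large external screen; the
            # integrated interface on the device itself is unchanged.
            self.external_ui = "full-gui:%s" % display_id
        return self.external_ui

    def on_display_disconnected(self):
        self.external_ui = None

dev = ComputingDevice()
dev.on_display_connected("UTUL-A4")
```

Touch-screen input from the external display would feed back into the same object through the connection, analogous to the user control described above.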
- In one embodiment the external ut/ul display device depends for most of its processing capabilities and storage capabilities on the computing device. The external display in such an embodiment has capabilities to communicate with the computing device. It may have some processing power to generate appropriate graphics, images and characters. This may include screen drivers, some control circuitry, power circuitry and some local memory to hold and perhaps recall images or GUI interfaces, as well as communication circuitry to communicate with the computing device. However, all actual processing and generation of data, images and the like takes place on the small footprint computing device. This means that only a minimum of circuitry and supporting connectivity, storage and power circuitry is located on the external ut/ul display, which limits its thickness and weight. In a further embodiment, one may include additional processing circuitry and memory on the external ut/ul display, allowing it to process some data for display and to cache data for future use. One may also include processing capabilities to locally generate and hold a GUI on the external display. This means that for instance an external display may be disconnected from the computing device without losing the data or screen information that is being displayed.
- The computing device in one embodiment of the present invention has wireless phone and other capabilities such as camera capabilities and/or document processing capabilities. In one embodiment of the present invention the wireless phone capabilities can be independently switched on and/or off without switching on/off the other capabilities.
- In one embodiment of the present invention, the screen of a connectable ut/ul display is a screen with touch screen capabilities. This allows the screen to support full video in color while also having touch screen capabilities. Such a light weight display with touch screen capability has been disclosed in US Patent Application 20100171708 to Chuang published on Jul. 8, 2010 which is incorporated herein by reference. It is to be understood that an embodiment of a ut/ul display that uses touch screen technology and/or a control that communicates with a processor on the computing device is enabled to receive and process signals from the computing device and to send signals to the computing device.
-
FIG. 28 is a diagram of an embodiment of the present invention. It shows a portable and connectable display 2804 with a screen 2805. The display 2804 has a connecting device 2803 which allows the display 2804 to be connected with a connecting device 2802 of a computing device 2801, which preferably is a small footprint computing device. The computing device 2801 in one embodiment has at least one lens/sensor unit 2807. In a further embodiment computing device 2801 has at least two lens/sensor units for creating panoramic and/or 3D images as disclosed herein. In one embodiment these panoramic and/or 3D images are created by a processor on 2801 and are transmitted to portable display 2804 to be displayed on screen 2805. In a further embodiment the constituent images are generated by the at least two lens/sensor units on 2801 and are provided with control data that allows a processor on 2804 to combine the images to form a panoramic and/or 3D image by applying the control data and to display the panoramic and/or 3D image on screen 2805. The portable display 2804 has a control 2806 which may be an on/off switch or one or more controls to control screen functionalities such as brightness, contrast and the like, and which may also contain a mouse or pointer like control to control for instance a cursor or pointer type graphic on the screen 2805. - In a further embodiment the computing device has at least 2 lens/sensor units, of which one, 2807, is shown facing the viewer of FIG. 28 and at least one faces the opposite side. This allows the device 2801 to record an image in the direction of a user and an image in the viewing direction of the user. In a further embodiment a lens/sensor unit in the above embodiment is at least a plurality of lens/sensor units that allows creating panoramic and/or 3D images in each direction. -
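The combining of constituent images by applying control data, as described for the processor on 2804, can be illustrated with a minimal stitching sketch. Here the control data is reduced to simple per-image column offsets — an assumption for illustration; real control data would carry alignment and blending parameters:

```python
# Minimal sketch of combining constituent images into a panoramic image
# by applying control data. Images are modeled as lists of pixel rows of
# equal height; control_data gives the column at which each image starts.

def stitch(images, control_data):
    height = len(images[0])
    width = max(off + len(img[0]) for img, off in zip(images, control_data))
    panorama = [[None] * width for _ in range(height)]
    for img, off in zip(images, control_data):
        for r, row in enumerate(img):
            for c, px in enumerate(row):
                panorama[r][off + c] = px  # later images overwrite the overlap
    return panorama

left = [[1, 2, 3]]    # 1x3 image from a first lens/sensor unit
right = [[3, 4, 5]]   # 1x3 image overlapping the first by one column
pano = stitch([left, right], control_data=[0, 2])
print(pano)  # [[1, 2, 3, 4, 5]]
```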
FIG. 29 shows a diagram of one embodiment wherein the computing device 2801 has a slot 2901 which contains connecting device 2902 which can receive and clamp to an edge of 2804 over the connecting device 2803 to create a single device consisting of 2801 and 2804. The computing device 2801 with slot 2901 is shown in side view in FIG. 29. A lens of lens/sensor unit 2807 is shown in the side view. A lens 2903 in an opposing direction is also shown. It is to be understood that in one embodiment of the present invention either 2807 or 2903 or both 2807 and 2903 are a plurality of lens/sensor units. -
FIG. 29 also shows a diagram of the combined devices 2801 and 2804 as a single device 2900 with screen 2805 serving as a direct viewer of either lens/sensor unit. In a further embodiment device 2900 records and shows images taken both by 2807 and 2903 at the same time on 2805. This may be done by displaying each respective image in a separate window on 2805. One may also show on 2805 at least one image, which may be a video image, received by the phone device in device 2801 from a network, which may be the Internet. One may also receive a plurality of different images at the same time through the phone device. One may open separate windows for each received image and display the images, which may be video images, on the screen. One may also transmit an image taken by device 2801 through the network to recipients. In a further embodiment at least one window on the screen may be a window related to an application that is being processed by a processor on 2801. This window may also be transmitted through the phone device on 2801 over the network, which may be the internet, to one or more recipients. Accordingly, 2900 can be used as a portable teleconferencing device. It is to be understood that the functionality of 2900 can also be performed by 2804 and 2801 when not combined but in communication through connecting devices 2802 and 2803. - It is also to be understood that while 2801 in FIG. 29 is shown being clamped on 2804 on the left edge, other configurations are possible, including the computing device being clamped on the right edge, on the top edge, on the bottom edge and on the back side of the display 2804. -
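The separate-window teleconferencing behavior of device 2900 can be sketched as a table of windows keyed by image source; the `Screen` class and the source names are illustrative assumptions, not part of the disclosure:

```python
# Sketch of per-source windows on screen 2805: local lens/sensor units,
# network video streams and an application each get their own window.

class Screen:
    def __init__(self):
        self.windows = {}  # source name -> latest frame shown in its window

    def show(self, source, frame):
        # Opens a window for a new source, or updates an existing window.
        self.windows[source] = frame

screen = Screen()
screen.show("camera-2807", "frame facing the user")
screen.show("camera-2903", "frame of the user's view")
screen.show("network-peer-1", "remote video frame")
screen.show("application", "word processor window")
print(sorted(screen.windows))
```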
FIG. 30 shows in side view diagram one embodiment of a computing device 3001 that can be clamped onto a connectable display 3004 able to connect with computing device 3001. Computing device 3001 has, as was disclosed above, a slot 2901 which can receive the side of the display 3004. In one embodiment of the present invention, computing device 3001 has a connector 2902, which was also disclosed above, and display 3004 has a connector 3005. Connectors 2902 and 3005 enable in one embodiment of the present invention computing device 3001 to power the display 3004. In one embodiment the power to a display is exclusively provided externally and the display does not have a battery. The slot 2901 may be equipped with springs 3003 which further secure the display in the slot. Other means to secure the display in a slot, including alignment slots, are fully contemplated. Side clamping with a slot has been provided as an illustrative example. Other means for connecting and fixating a computing device 3001 to a display 3004 are fully contemplated. For instance the display may receive the body of the computing device on the blind side. In another illustrative example the computing device may have a hook shaped connector that hooks over an edge of the display housing, with a connector inside the hook shape of the computing device and a connector on the edge of the display. - The computing device and the connectable ut/ul display are two different devices that can be connected and/or combined. One way to not exclusively describe a combination of a computing device and a display is by the size of their screens, wherein the screen of the display is larger than that of the device it connects to. Another way to not exclusively describe an aspect of the invention is that no dimension in size of the computing device matches or is equal or substantially equal to the dimensions of the display, with the condition that the screen of the display is larger than the screen of the computing device.
The computing device housing has a length Lc (its longest dimension), a width Wc and a thickness Tc, wherein all dimensions refer to the housings. These can be referred to as (Lc, Wc, Tc). A display, in a similar way, has housing dimensions (Ld, Wd, Td). In a first embodiment none of the corresponding dimensions of (Lc, Wc, Tc) and (Ld, Wd, Td) are equal (Lc≠Ld, Wc≠Wd, Tc≠Td). In a further embodiment none of the dimensions (Lc, Wc, Tc) are equal to any of the dimensions (Ld, Wd, Td). In a further embodiment Lc<Ld and Wc<Wd. In a further embodiment Tc>Td.
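The dimensional embodiments above reduce to simple predicates. The sketch below uses illustrative sizes in centimeters; the function names are assumptions for illustration:

```python
# (Lc, Wc, Tc) is the computing-device housing; (Ld, Wd, Td) is the display.

def corresponding_dims_differ(c, d):
    # First embodiment: Lc != Ld, Wc != Wd, Tc != Td.
    return all(x != y for x, y in zip(c, d))

def no_dim_matches(c, d):
    # Further embodiment: no device dimension equals any display dimension.
    return all(x != y for x in c for y in d)

device = (10.0, 5.0, 2.0)     # illustrative housing sizes in cm
display = (25.0, 20.0, 0.5)
print(corresponding_dims_differ(device, display))         # True
print(no_dim_matches(device, display))                    # True
print(device[0] < display[0] and device[1] < display[1])  # Lc<Ld and Wc<Wd
print(device[2] > display[2])                             # Tc>Td
```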
- In a further embodiment the display can connect, preferably wirelessly, with any authorized computing device, including communication type devices such as routers, to receive a signal with image data to be displayed. Such an authorization may be established by an authorization code or a way of coding or modulation as is known to one of ordinary skill in the art that allows an apparatus to be identified as authorized. In a further embodiment, the display is enabled to send a probing signal wirelessly that can be recognized by a computing device. The computing device is programmed to recognize a specific signal, associate the specific signal with a specific authorization and transmit data, including image data, to the display in response to a recognized probing signal. In a further embodiment the process starts with a computing device sending a signal that can be received and interpreted by the display. This signal tells the display that a computing device is present and ready to receive a probing signal from the display. This allows a display to be traced by a computing device when it is being moved and to keep providing the display with appropriate data, which may include image data.
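The probing and authorization exchange described above might be sketched as follows; the message format, the authorization codes and all names are hypothetical placeholders, not the patent's protocol:

```python
# Sketch: the computing device announces itself, the display probes with
# an authorization code, and image data flows only for a recognized code.

AUTHORIZED_CODES = {"display-42"}  # hypothetical provisioned codes

def announce():
    # The computing device signals that it is present and ready to serve.
    return "READY"

def probe(display_code):
    # The display's probing signal carrying its authorization code.
    return ("PROBE", display_code)

def handle_probe(message):
    # The computing device recognizes the probe and checks authorization.
    kind, code = message
    if kind == "PROBE" and code in AUTHORIZED_CODES:
        return {"status": "ok", "image_data": b"..."}
    return {"status": "denied"}

assert announce() == "READY"
print(handle_probe(probe("display-42"))["status"])  # ok
print(handle_probe(probe("intruder"))["status"])    # denied
```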
- In a further embodiment a plurality of displays are static in location and placed in different positions and a first computing device is mobile, with a moving user for instance. A user may for instance be watching a video program that is stored on a second computing device on a first display and that is provided over a network to the first display. Relevant data about the video program is provided and stored on the first computing device. The user may move to a different location with a second display. The first computing device in communication with the second display communicates the presence of the user and provides the second display with the required data for the second display to show the program that was being watched on the first display with the correct timing and status data. This allows a user to move around a house and follow a video program continuously on different displays at different locations.
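The hand-off of a running video program between static displays, with the mobile computing device carrying the timing and status data, can be sketched in miniature. The classes and the status layout are assumptions for illustration:

```python
# Sketch: the first computing device follows the user, holds the playback
# status, and hands it to whichever display the user moves to.

class MobileDevice:
    def __init__(self):
        self.status = None  # e.g. {"program": ..., "position_s": ...}

    def observe(self, program, position_s):
        # Relevant data about the program being watched is kept up to date.
        self.status = {"program": program, "position_s": position_s}

    def hand_off(self, display):
        # Provide the new display with what it needs to resume correctly.
        display.resume(self.status)

class WallDisplay:
    def __init__(self, location):
        self.location = location
        self.playing = None

    def resume(self, status):
        # Continue the program with the correct timing and status data.
        self.playing = dict(status)

device = MobileDevice()
device.observe("movie-night", position_s=754)
kitchen = WallDisplay("kitchen")
device.hand_off(kitchen)              # the user walked to the kitchen
print(kitchen.playing["position_s"])  # 754
```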
- One may also apply a similar approach to computer applications. That is: a user may be working on one computer application at one display. All status information may be stored on a mobile first computing device that moves with the user or on a second computing device which is a server connected to a network. A status may be assigned a code which is stored on the first mobile computing device. The user, when moving to a different location with another display, brings the first computing device. The first computing device interacts via the network with the second computing device, which interprets the code associated with the status provided by the first computing device. When the application is processed from the server or second computing device it now automatically displays the correct interface associated with the status of the application. If the application is processed from the first computing device the second display starts displaying the correct interface based on the last status of the application.
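The status-code mechanism above, where a code carried on the first computing device indexes application state kept and interpreted by the server, might look like this in miniature; the code format and the state layout are assumptions for illustration:

```python
# Sketch: a status code stored on the mobile device indexes full
# application state on the second computing device (the server).

server_state = {}  # status code -> saved application state

def save_status(code, state):
    # The server records the state under the code the mobile device holds.
    server_state[code] = state
    return code

def restore(code):
    # The server interprets the code and returns the interface state
    # to be displayed at the user's new location.
    return server_state[code]

code = save_status("S-17", {"app": "editor", "document": "draft.txt", "cursor": 120})
state = restore(code)  # performed when the user reaches the second display
print(state["document"], state["cursor"])
```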
- In one embodiment the large connectable ultra-thin display is provided with a lens/sensor unit. In one embodiment the display has a thickness of 7 mm or less and has a weight of less than 16 ounces, preferably a weight of 12 ounces or less and a size of about 25 cm by 20 cm or larger. The lens/sensor unit can be a lens/sensor unit as disclosed by for instance Aptina Imaging of San Jose, Calif. in a wafer level camera form and with a claimed thickness of 2.5 mm. Another company offering a similar wafer level camera, called NemoCAM-02AVG, is Nemotek of Morocco. Its website describes the unit as having a thickness of 2.6 mm. The datasheet of the NemoCAM-02AVG is incorporated herein by reference. A wafer level camera is also taught in US Patent Appl. Publ. Ser. No. 20090309177 to Jeung et al. published on Dec. 17, 2009 which is incorporated herein by reference. By using reflow soldering it is claimed by manufacturers that the wafer level camera can be soldered to a flexible PCB. This may add at least about 0.5 mm to the thickness of the camera. It is believed that the Nemotek camera module can be installed in a display that is 7 mm thick, as a housing can be created that has about 5-6 mm space if an ultrathin housing is created, for instance made from a high strength nanocarbon fiber reinforced shell. The shell can be reinforced with a structural foam.
- It seems that an ultrathin display of about 5 mm thick or less may have trouble accommodating the wafer level camera module. This can be alleviated by forming a hole in the shell that receives the upper part of the wafer level camera. It appears from the literature and disclosures that were available at the time this aspect of the invention was made that no device having a thickness of 5 mm or less has been contemplated to incorporate a camera that is embedded completely inside the housing. Clearly, one can incorporate a larger camera if it is allowed to stick out of the housing. In one embodiment of the present invention a ut/ul display having a thickness ‘d’ or less means that its greatest thickness at any location is not greater than ‘d’. In such an embodiment, if its general thickness is ‘a’ and a part sticks out by an amount of ‘b’, then the thickness is determined to be ‘a’+‘b’. So, with a thickness of 5 mm or less in such an embodiment, the camera module has to fit within a housing that has no greater thickness than 5 mm anywhere.
- In accordance with one aspect of the present invention one may provide such a wafer level camera module with a much smaller backplane rather than a Ball Grid Array (BGA). This is shown in
FIG. 34 wherein a module 3401 is attached to a BGA connector 3402 in a unit 3400. This connector still has to be soldered to a PCB, all adding to the thickness. In one embodiment of the present invention all connections, including power, are provided through a thin backplane 3403 of about 0.5 mm thick in a first embodiment, but not thicker than 1 mm in a second embodiment. The backplane 3403 may have a BGA connector 3406 or any slim connector facing the top of the camera module. Assuming that the thickness of the shell of the housing of the display is about 0.5 mm, the connector 3407 can be about 2 mm thick without interfering with the shell. In one embodiment one applies wireless chip-to-chip communication between the sensor of the camera module and a connector chip 3407. Wireless chip-to-chip communication is enabled in for instance US Patent Application Publ. Ser. No. 20090175323 to Chung published on Jul. 9, 2009 and in the article "RF/Wireless Interconnect for Inter- and Intra-Chip Communications" by Mau-Chung Frank Chang et al., Proceedings of the IEEE, Vol. 89, No. 4, April 2001, pages 456-466, which are both incorporated herein by reference. This requires that the sensor chip of module 3401 is expanded with the communication part, which increases the size of the die. However, in this case the area of the chip is not as critical as the thickness of the chip or the backplane. By using the wireless connection, the backplane can be drastically reduced in size. The module 3401 can be provided with power through a very thin power rail 3403. The receiver chip is located in 3407. While the same diagram of FIG. 34 is applied, it is noted that specific components in this embodiment have a different function from the earlier provided embodiment in the diagram of FIG. 34. - In a further embodiment of the present invention, if space allows it, thin flanges are applied to hold the module 3401 against the surface of the housing. Preferably the flanges are 0.5 mm or less thick. FIG. 35 shows in diagram how the camera unit 3405 in one embodiment is fitted against the underside of the surface 3502 of the shell of the housing of the display. FIG. 35 shows a diagram both in cross-sectional view and in top view. -
FIG. 31 shows in diagram one embodiment of a display body 3100 with a screen 3105 and a control 3106 and at least one lens/sensor unit 3101 and a connector 3103 to connect the display to the outside world. It is to be understood that related and required circuitry and connections as known to one of ordinary skill are included in the body. In a further embodiment at least one lens/sensor unit 3201 in a display body 3200 is fitted into the body at the side of the display at the other side of the screen. A control 3107 may be included on that side. In a further embodiment of the present invention the display includes at least one lens/sensor unit 3101 at one side and at least one lens/sensor unit 3201 on the other side of the body of the ut/ul display. In a further embodiment of the present invention the ut/ul display includes at least two lens/sensor units 3101 at one side and/or at least two lens/sensor units 3201 on the other side of the body of the ut/ul display. In a further embodiment of the present invention both at least one lens/sensor units are operated at the same time to create an image which may be a video image. It should be clear that if a body of the display has a thickness of about 5 mm or less, then the lens/sensor units must be correspondingly thin. FIG. 33 shows a diagram of an embodiment wherein one side of the display 3300 has at least two lens/sensor units. - One can accommodate a lens/sensor unit of a thickness, complete with connectors, of about 3 mm in a display with a thickness of 3 mm if part of the camera is allowed to stick out of the housing. This is illustrated in FIG. 36. A wafer level camera 3605 is fixed with its lens barrel through the wall 3603 and fits tightly with its backplane against the inside of the wall. Protective material 3603 to protect the barrel may be applied. - It is to be understood that a ut/ul display as provided herein, be it a detached, attached or detachable display, displays images as generated by the computing device in real-time or close to real-time, with a delay that is in general not significantly noticeable by a user. One may use in one embodiment of the present invention display screen or screen material that has no or limited latency and can show images such as video in real-time and in full color, with a speed of at least 12 images or frames per second in one embodiment, at least 20 images per second in a more preferred embodiment, 30 images per second in an even more preferred embodiment and at least 50 images or frames per second in a most preferred embodiment. Accordingly, real-time video in one embodiment is video displaying at least 12 frames per second.
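The frame-rate tiers above, including the definition of real-time video as at least 12 frames per second, reduce to a one-line predicate; the function name is an illustrative assumption:

```python
# Per the text, "real-time" video in one embodiment means at least
# 12 frames per second; higher tiers are 20, 30 and 50 fps.

def is_real_time(frames, seconds, min_fps=12):
    return frames / seconds >= min_fps

print(is_real_time(360, 30))               # 12 fps -> True
print(is_real_time(360, 30, min_fps=20))   # fails the 20 fps tier
print(is_real_time(1500, 30, min_fps=50))  # meets the most preferred tier
```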
- In a further embodiment of the present invention one may use material such as ePaper or eInk that has considerable latency and may show images at a speed of 10 images or frames per second or even less. Images may be shown in one embodiment of the present invention in shades of grey. It was already described previously that different materials to display images in accordance with an aspect of the present invention are available.
- To serve its purpose as a large screen, the ut/ul display must be fairly large while remaining thin and ultra light. It is believed that if a display does not meet certain dimensional criteria it will not serve its purpose as a detached portable large screen. It should have a screen of diagonal dimension of at least 10 cm, more preferably at least 15 cm, and preferably a diagonal dimension of 20 cm or even larger. Preferably it has a thickness of not more than 10 mm, more preferably not more than 7 mm, more preferably not more than 5 mm. Further, preferably a thickness should be less than 5 mm, preferably 3 mm or less, preferably 2 mm. The weight of the display should preferably be less than 300 gram, in a further embodiment of the present invention less than 200 gram and in yet a further embodiment less than 100 gram. Preferably, the portable and self contained ut/ul display should be of the size, weight, and thickness of a magazine that one would carry without being bothered by its weight. One should be able to carry the ut/ul display in a bag or briefcase, such as one would carry a magazine during travel. In one embodiment the ut/ul display has a screen that is a touch screen and the display can communicate in a two way manner with the computing device. In a further embodiment the ut/ul display is connectable to a small footprint device that in a stand alone mode is mainly a mobile phone, but when connected with the display provided with touch screen capabilities is a full blown operational computing device, enabled to perform computing applications such as the full Microsoft Office® Suite of applications (Outlook, Word, Powerpoint, Excel, and Access), games, Internet browsing and the like.
- Such a computing device in one embodiment has at least a processor, mass storage, display processing capability, communication capability, input and output capabilities and a minimal display, with an operating system and application software stored on the device, while measuring not over 2 cm by 3 cm by 10 cm in volume, preferably not more than 2 cm by 3 cm by 6 cm in volume.
- At the time that this invention was made one available display would be an eReader. However, eReaders currently are not available as a separate portable display for a computing device with preferably at least one lens/sensor unit. In fact, one can conclude that eReaders (such as the Kindle II marketed by Amazon) are too small, too thick and too heavy to meet the requirements of the ut/ul display as defined above. eReaders are also not geared to display images in real-time, such as video images, generated from a lens/sensor unit in the computing device, which may be a smart phone with a camera.
- One can find reviews of commercially available eReaders on the Internet. Such a review may be found at for instance <http://ebook-reader-review.toptenreviews.com/>. The eReaders are relatively small, thick and heavy. The battery as used in these eReaders causes the eReaders to be heavy and thick.
- A separate and portable ut/ul display, which is preferably self contained and is enabled to communicate, preferably wirelessly, with a computing device such as a cell phone, is provided herein as one aspect of the present invention. In one embodiment of the present invention the display is ultra thin. For instance in one embodiment the display is an ultra thin ePaper or eInk like display in a thin housing, and in its entirety not thicker than 5 mm. In a further embodiment the display with a portable housing has a thickness of 3 mm or thinner. The display screen material itself, for instance as announced by E Ink Corporation of Cambridge, Mass., is about 2 mm or thinner. Additional thickness or bulk is contributed by a housing and in particular by an ultra-thin battery to power the display.
- It should be clear that the large display as provided as an aspect of the present invention is very well enabled to display a panoramic image. The screen material in a further embodiment of the present invention is enabled to display 3D images.
- An external and separate mobile and portable display as disclosed herein works with a mobile and portable computing device. A computing device has at least a processor, memory, preferably but not necessarily a display that is smaller than the external display, a transmitter to send a signal to the external display, and at least one lens/sensor unit enabled to record a digital image and transmit the image to the external display in real-time or close to real-time, allowing a user to watch in real-time or close to real-time a color video image recorded by the computing device. In one embodiment of the present invention the computing device with at least one lens/sensor unit is a general computer working under an operating system and enabled to execute standard computer application software such as word-processing, e-mail, browsing, games and the like. In one embodiment of the present invention the computing device with at least one lens/sensor unit is a mobile phone, enabled to connect wirelessly to a mobile phone network. In one embodiment of the present invention the computing device with at least one lens/sensor unit is at least a mobile phone, enabled to connect wirelessly to a mobile phone network, and a camera, and is called a mobile camera phone. In one embodiment of the present invention the computing device with at least one lens/sensor unit is a tablet PC with a screen smaller than the external display. In one embodiment of the present invention the computing device with at least one lens/sensor unit is a smart phone, or PDA, enabled to connect wirelessly to a mobile phone network. In one embodiment of the present invention the computing device with at least one lens/sensor unit is a stand-alone camera. In a further embodiment of the present invention a computing device has at least two lens/sensor units and is enabled to create in real-time a single image such as a video image that is created from at least two separate images.
- The following patent applications, including the specification, claims and drawings, are hereby incorporated by reference herein, as if they were fully set forth herein: U.S. Non-Provisional patent application Ser. No. 12/435,624 filed on May 5, 2009 and U.S. Non-Provisional patent application Ser. No. 12/436,874 filed on May 7, 2009, U.S. Non-Provisional patent application Ser. No. 12/538,401 filed on Aug. 10, 2009, and U.S. Non-Provisional patent application Ser. No. 12/634,058 filed on Dec. 9, 2009.
- While there have been shown, described and pointed out fundamental novel features of the invention as applied to preferred embodiments thereof, it will be understood that various omissions and substitutions and changes in the form and details of the device illustrated and in its operation may be made by those skilled in the art without departing from the spirit of the invention. It is the intention, therefore, to be limited only as indicated by the scope of the claims appended hereto.
Claims (20)
1. A large, thin, portable and mobile display communicatively connectable to a portable and mobile computing device, comprising:
a housing, having a thickness not greater than 7 mm;
a receiver to receive a signal generated by the computing device;
a processor authorized to process the signal generated by the computing device;
a screen in the housing enabled to display a video image in real-time in response to the signal generated by the portable computing device received by the display, the screen having a size not smaller in area than 300 cm2; and
the display having a weight that does not exceed 12 ounces.
2. The display of claim 1 , wherein the housing has a thickness not greater than 5 mm.
3. The display of claim 1 , wherein the housing has a thickness not greater than 4 mm.
4. The display of claim 1 , wherein the screen has a size not smaller in area than 500 cm2.
5. The display of claim 1 , wherein a communication connection with the portable computing device is a wireless connection.
6. The display of claim 1 , wherein a communication connection with the portable computing device is a wired connection.
7. The display of claim 6 , wherein the portable computing device powers the display through the wired connection.
8. The display of claim 1 , wherein the computing device has a volume that is not larger than 50 cm3.
9. The display of claim 1 , wherein the computing device is a stand alone camera.
10. The display of claim 1 , wherein the computing device is a mobile camera phone.
11. The display of claim 1 , further comprising:
a positional sensor to generate a signal corresponding to a change of position of the display;
a portable and mobile platform that is enabled to be removably attached to a structure or an object, the platform containing:
a holder to hold the computing device; and
at least one actuator that is controlled by a signal corresponding to the signal generated by the positional sensor.
12. A computing system, comprising:
a mobile computing device containing a processor, data memory, at least one lens/sensor unit, and a communication interface;
a mobile computer display separate from the mobile computing device with a screen size that is not smaller in area than 300 cm2 enabled to display a video image in color in real-time generated by the at least one lens/sensor unit, the computer display having a body with a thickness not exceeding 7 mm and a communication interface; and wherein
the communication interface of the mobile computing device is enabled to communicate directly and uniquely with the communication interface of the mobile computer display.
13. The computing system of claim 12 , wherein the communication interface of the mobile computing device is enabled to communicate wirelessly with the communication interface of the mobile computer display.
14. The computing system of claim 12 , wherein the communication interface of the mobile computing device is enabled to communicate via a wired connection with the communication interface of the mobile computer display to provide power from the mobile computing device to the mobile computing display.
15. The computing system of claim 12 , wherein the mobile computing device is a stand alone camera.
16. The computing system of claim 12 , wherein the mobile computing device is a mobile camera phone.
17. The computing system of claim 12 , wherein the mobile computing device contains at least two lens/sensor units enabled to generate a single image composed from at least two images.
18. The computing system of claim 12 , further comprising:
a mobile platform rotatable in at least one plane of rotation that is separate from the mobile computer display, that is enabled to hold the mobile computing device and that is attachable to an object or structure; and
an actuator in the mobile platform that is controlled by a positional sensor attached to the mobile computer display.
19. The computing system of claim 18 , further comprising:
a processor programmed to track an image recorded by the mobile computing device of a moving object and programmed to control the actuator in the mobile platform to maintain an image of the object on the screen of the mobile computer display.
20. A mobile platform, comprising:
a fixture enabled to be removably attached to an object or a structure;
a housing containing circuitry to process a signal associated with a signal generated by a positional sensor attached to a remote computer display with a screen;
at least one remotely controlled actuator;
an arm that is enabled to be rotatably moved by the at least one remotely controlled actuator based on the signal associated with the signal generated by the positional sensor attached to the remote computer display; and
a device holder attached to the arm enabled to hold a computing device that includes at least one lens/sensor unit that is enabled to provide a video image to be displayed in real-time on the screen of the remote display.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/983,168 US20110098083A1 (en) | 2008-05-19 | 2010-12-31 | Large, Ultra-Thin And Ultra-Light Connectable Display For A Computing Device |
US13/844,626 US9171221B2 (en) | 2010-07-18 | 2013-03-15 | Camera to track an object |
US15/829,837 US10261408B2 (en) | 2010-07-18 | 2017-12-01 | Mobile and portable camera platform for tracking an object |
US15/836,815 US10331024B2 (en) | 2008-05-19 | 2017-12-08 | Mobile and portable screen to view an image recorded by a camera |
US16/011,319 US10585344B1 (en) | 2008-05-19 | 2018-06-18 | Camera system with a plurality of image sensors |
US16/423,357 US10831093B1 (en) | 2008-05-19 | 2019-05-28 | Focus control for a plurality of cameras in a smartphone |
US16/814,719 US11119396B1 (en) | 2008-05-19 | 2020-03-10 | Camera system with a plurality of image sensors |
US17/472,658 US20210405518A1 (en) | 2008-05-19 | 2021-09-12 | Camera system with a plurality of image sensors |
Applications Claiming Priority (12)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US5429008P | 2008-05-19 | 2008-05-19 | |
US5527208P | 2008-05-22 | 2008-05-22 | |
US8972708P | 2008-08-18 | 2008-08-18 | |
US10602508P | 2008-10-16 | 2008-10-16 | |
US10676808P | 2008-10-20 | 2008-10-20 | |
US12/435,624 US8164655B2 (en) | 2008-05-19 | 2009-05-05 | Systems and methods for concurrently playing multiple images from a storage medium |
US12/538,401 US8355042B2 (en) | 2008-10-16 | 2009-08-10 | Controller in a camera for creating a panoramic image |
US12/634,058 US20100097444A1 (en) | 2008-10-16 | 2009-12-09 | Camera System for Creating an Image From a Plurality of Images |
US29186110P | 2010-01-01 | 2010-01-01 | |
US32287510P | 2010-04-11 | 2010-04-11 | |
US36534710P | 2010-07-18 | 2010-07-18 | |
US12/983,168 US20110098083A1 (en) | 2008-05-19 | 2010-12-31 | Large, Ultra-Thin And Ultra-Light Connectable Display For A Computing Device |
Related Parent Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/435,624 Continuation-In-Part US8164655B2 (en) | 2008-05-19 | 2009-05-05 | Systems and methods for concurrently playing multiple images from a storage medium |
US12/634,058 Continuation-In-Part US20100097444A1 (en) | 2008-05-19 | 2009-12-09 | Camera System for Creating an Image From a Plurality of Images |
US16/814,719 Continuation-In-Part US11119396B1 (en) | 2008-05-19 | 2020-03-10 | Camera system with a plurality of image sensors |
Related Child Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/844,626 Continuation-In-Part US9171221B2 (en) | 2008-05-19 | 2013-03-15 | Camera to track an object |
US15/829,837 Continuation US10261408B2 (en) | 2010-07-18 | 2017-12-01 | Mobile and portable camera platform for tracking an object |
US15/836,815 Continuation US10331024B2 (en) | 2008-05-19 | 2017-12-08 | Mobile and portable screen to view an image recorded by a camera |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110098083A1 true US20110098083A1 (en) | 2011-04-28 |
Family
ID=43898890
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/983,168 Abandoned US20110098083A1 (en) | 2008-05-19 | 2010-12-31 | Large, Ultra-Thin And Ultra-Light Connectable Display For A Computing Device |
US15/836,815 Expired - Fee Related US10331024B2 (en) | 2008-05-19 | 2017-12-08 | Mobile and portable screen to view an image recorded by a camera |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/836,815 Expired - Fee Related US10331024B2 (en) | 2008-05-19 | 2017-12-08 | Mobile and portable screen to view an image recorded by a camera |
Country Status (1)
Country | Link |
---|---|
US (2) | US20110098083A1 (en) |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110304580A1 (en) * | 2010-06-11 | 2011-12-15 | Fih (Hong Kong) Limited | Input device using projector |
US20120115543A1 (en) * | 2010-11-05 | 2012-05-10 | Hon Hai Precision Industry Co., Ltd. | Head mounted display apparatus with phone function |
CN102625023A (en) * | 2011-01-31 | 2012-08-01 | 索尼公司 | Information processing apparatus, information processing method, program, and imaging apparatus |
US20120320180A1 (en) * | 2008-12-30 | 2012-12-20 | May Patents Ltd. | Electric shaver with imaging capability |
US20130237811A1 (en) * | 2012-03-07 | 2013-09-12 | Speir Technologies Inc. | Methods and systems for tracking and guiding sensors and instruments |
US20130250041A1 (en) * | 2012-03-26 | 2013-09-26 | Altek Corporation | Image capture device and image synthesis method thereof |
US20140098193A1 (en) * | 2012-10-05 | 2014-04-10 | Research In Motion Limited | Methods and devices for generating a stereoscopic image |
US20140096642A1 (en) * | 2012-10-05 | 2014-04-10 | Remy Technologies, Llc | Starter motor |
US20140098197A1 (en) * | 2012-10-05 | 2014-04-10 | Research In Motion Limited | Methods and devices for generating a stereoscopic image |
US20140179384A1 (en) * | 2011-08-04 | 2014-06-26 | Zvi Minkovitch | Method, system and apparatus for managing a football match |
US8792189B2 (en) | 2012-01-26 | 2014-07-29 | Bibek Sahu | Personal media viewing device |
DE102013006330A1 (en) * | 2013-04-12 | 2014-10-16 | GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) | Method and device for determining dimensions of a vehicle |
EP2860961A1 (en) * | 2013-10-11 | 2015-04-15 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
US20150181197A1 (en) * | 2011-10-05 | 2015-06-25 | Amazon Technologies, Inc. | Stereo imaging using disparate imaging devices |
US20150201193A1 (en) * | 2012-01-10 | 2015-07-16 | Google Inc. | Encoding and decoding techniques for remote screen sharing of media content using video source and display parameters |
US20150249794A1 (en) * | 2010-10-05 | 2015-09-03 | Sony Computer Entertainment Inc. | Apparatus and method for displaying images |
US20150296174A1 (en) * | 2014-03-26 | 2015-10-15 | Bosh Technologies Llc | Modular display system |
WO2016025962A1 (en) * | 2014-08-15 | 2016-02-18 | The University Of Akron | Device and method for three-dimensional video communication |
CN105468320A (en) * | 2015-11-16 | 2016-04-06 | 歌尔声学股份有限公司 | Black and white screen display method and device based on Android platform and intelligent terminal |
US20160148075A1 (en) * | 2014-11-20 | 2016-05-26 | Oberthur Technologies | Method and device for authentification of a user |
US20160234477A1 (en) * | 2015-02-06 | 2016-08-11 | Tara Chand Singhal | Apparatus and method for 3d like camera system in a handheld mobile wireless device |
US20160295121A1 (en) * | 2015-04-03 | 2016-10-06 | Canon Kabushiki Kaisha | Accessory for camera and storage medium storing display control program |
US20170011732A1 (en) * | 2015-07-07 | 2017-01-12 | Aumed Corporation | Low-vision reading vision assisting system based on ocr and tts |
US20170034433A1 (en) * | 2013-03-14 | 2017-02-02 | Futurewei Technologies, Inc. | Camera Augmented Reality Based Activity History Tracking |
US20170353636A1 (en) * | 2016-06-03 | 2017-12-07 | Ancil Bishop | Tri-lens image capturing device |
US9889367B2 (en) | 2013-03-18 | 2018-02-13 | Zvi Minkovitch | Sports match refereeing system |
US10334203B2 (en) * | 2016-11-07 | 2019-06-25 | Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. | Thin-profile television device |
CN110196615A (en) * | 2018-02-26 | 2019-09-03 | 广东欧珀移动通信有限公司 | A kind of control method of functional module group, electronic equipment and readable storage medium storing program for executing |
US10617401B2 (en) | 2014-11-14 | 2020-04-14 | Ziteo, Inc. | Systems for localization of targets inside a body |
WO2021052252A1 (en) * | 2019-09-17 | 2021-03-25 | 深圳市万普拉斯科技有限公司 | Camera raising and retracting control method and apparatus, and mobile terminal |
US11070712B2 (en) * | 2019-08-30 | 2021-07-20 | Puwell Technology Llc | Method and system for control of a digital camera system |
US20210289190A1 (en) * | 2020-01-09 | 2021-09-16 | Jerry Nims | 2d image capture system & display of 3d digital image |
US20210297647A1 (en) * | 2020-01-09 | 2021-09-23 | Jerry Nims | 2d image capture system, transmission & display of 3d digital image |
US20210321077A1 (en) * | 2020-01-09 | 2021-10-14 | JUC Holdings Limited | 2d digital image capture system and simulating 3d digital image sequence |
US20220051427A1 (en) * | 2020-01-09 | 2022-02-17 | JUC Holdings Limited | Subsurface imaging and display of 3d digital image and 3d image sequence |
US20220078392A1 (en) * | 2020-01-09 | 2022-03-10 | JUC Holdings Limited | 2d digital image capture system, frame speed, and simulating 3d digital image sequence |
US11439358B2 (en) | 2019-04-09 | 2022-09-13 | Ziteo, Inc. | Methods and systems for high performance and versatile molecular imaging |
US20220385880A1 (en) * | 2021-05-28 | 2022-12-01 | Jerry Nims | 2d digital image capture system and simulating 3d digital image and sequence |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110198438A (en) * | 2019-07-05 | 2019-09-03 | 浙江开奇科技有限公司 | Image treatment method and terminal device for panoramic video image |
Citations (80)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4782402A (en) * | 1986-03-14 | 1988-11-01 | Pioneer Electronic Corporation | Video disk with multiplexed video and digital information |
US4866430A (en) * | 1986-12-11 | 1989-09-12 | Motorola, Inc. | Low voltage LED driver circuit |
US5257324A (en) * | 1991-11-01 | 1993-10-26 | The United States Of America As Represented By The Secretary Of The Navy | Zero-time-delay video processor circuit |
US5343243A (en) * | 1992-01-07 | 1994-08-30 | Ricoh Company, Ltd. | Digital video camera |
US5568192A (en) * | 1995-08-30 | 1996-10-22 | Intel Corporation | Method and apparatus for processing digital video camera signals |
US5646679A (en) * | 1994-06-30 | 1997-07-08 | Canon Kabushiki Kaisha | Image combining method and apparatus |
US5680649A (en) * | 1994-06-29 | 1997-10-21 | Seiko Precision Inc. | Lens driving device for auto-focus camera |
US5947359A (en) * | 1997-06-25 | 1999-09-07 | Kojima Press Industry Co., Ltd. | Device for holding a portable telephone |
US6043310A (en) * | 1994-05-31 | 2000-03-28 | General Electric Company | Thin-wall housing |
US6178144B1 (en) * | 1998-06-02 | 2001-01-23 | Seagate Technology Llc | Magneto-optical recording system employing linear recording and playback channels |
US6282320B1 (en) * | 1997-03-19 | 2001-08-28 | Sony Corporation | Video data decoding apparatus and method and video signal reproduction apparatus and method |
US6323858B1 (en) * | 1998-05-13 | 2001-11-27 | Imove Inc. | System for digitally capturing and recording panoramic movies |
US6445833B1 (en) * | 1996-07-18 | 2002-09-03 | Sanyo Electric Co., Ltd | Device and method for converting two-dimensional video into three-dimensional video |
US6574417B1 (en) * | 1999-08-20 | 2003-06-03 | Thomson Licensing S.A. | Digital video processing and interface system for video, audio and ancillary data |
US6669346B2 (en) * | 2000-05-15 | 2003-12-30 | Darrell J. Metcalf | Large-audience, positionable imaging and display system for exhibiting panoramic imagery, and multimedia content featuring a circularity of action |
US6696198B2 (en) * | 1998-12-10 | 2004-02-24 | Matsushita Electric Industrial Co., Ltd. | Flat battery |
US6727941B1 (en) * | 2000-08-18 | 2004-04-27 | The United States Of America As Represented By The Secretary Of The Navy | Universal digital camera controller with automatic iris tuning |
US20040257436A1 (en) * | 1997-04-21 | 2004-12-23 | Sony Corporation | Controller for photographing apparatus and photographing system |
US6885374B2 (en) * | 2001-06-29 | 2005-04-26 | Intel Corporation | Apparatus, method and system with a graphics-rendering engine having a time allocator |
US20050122400A1 (en) * | 2003-08-13 | 2005-06-09 | Topcon Corporation | Photographic apparatus with function of image correction and method thereof |
US6961055B2 (en) * | 2001-05-09 | 2005-11-01 | Free Radical Design Limited | Methods and apparatus for constructing virtual environments |
US6972796B2 (en) * | 2000-02-29 | 2005-12-06 | Matsushita Electric Industrial Co., Ltd. | Image pickup system and vehicle-mounted-type sensor system |
US6975308B1 (en) * | 1999-04-30 | 2005-12-13 | Bitetto Frank W | Digital picture display frame |
US6992733B1 (en) * | 1997-04-11 | 2006-01-31 | Micron Technology, Inc. | Backlighting system for an LCD |
US20060061951A1 (en) * | 2003-05-29 | 2006-03-23 | Fujitsu Limited | Method for controlling computer system having wireless display and computer system |
US7020888B2 (en) * | 2000-11-27 | 2006-03-28 | Intellocity Usa, Inc. | System and method for providing an omnimedia package |
US20060164883A1 (en) * | 2005-01-25 | 2006-07-27 | Peter Lablans | Multi-valued scrambling and descrambling of digital data on optical disks and other storage media |
US7085484B2 (en) * | 2002-07-17 | 2006-08-01 | Minolta Co., Ltd. | Driving device, position controller provided with driving device, and camera provided with position controller |
US7102686B1 (en) * | 1998-06-05 | 2006-09-05 | Fuji Photo Film Co., Ltd. | Image-capturing apparatus having multiple image capturing units |
US7123745B1 (en) * | 1999-11-24 | 2006-10-17 | Koninklijke Philips Electronics N.V. | Method and apparatus for detecting moving objects in video conferencing and other applications |
US7126897B2 (en) * | 2002-03-18 | 2006-10-24 | Ricoh Company, Ltd. | Multi-level information recording apparatus, multi-level information recording method, multi-level information recording medium and multi-level information recording-reproducing apparatus |
US7136633B2 (en) * | 1996-10-08 | 2006-11-14 | Cingular Wireless Ii, Llc | Method and apparatus for over-the-air programming of telecommunication services |
US7149178B2 (en) * | 1999-02-18 | 2006-12-12 | Lsi Logic Corporation | Method and format for reading and writing in a multilevel optical data systems |
US20070010200A1 (en) * | 2005-07-07 | 2007-01-11 | Kabushiki Kaisha Toshiba | Wireless communication device |
US7167155B1 (en) * | 1995-07-20 | 2007-01-23 | E Ink Corporation | Color electrophoretic displays |
US20070031062A1 (en) * | 2005-08-04 | 2007-02-08 | Microsoft Corporation | Video registration and image sequence stitching |
US20070035516A1 (en) * | 2005-08-09 | 2007-02-15 | Delphi Technologies, Inc. | Joystick sensor with light detection |
US7215144B2 (en) * | 2004-05-20 | 2007-05-08 | International Business Machines Corporation | Pre-emphasis driver with constant impedance |
US20070120979A1 (en) * | 2005-11-21 | 2007-05-31 | Microsoft Corporation | Combined digital and mechanical tracking of a person or object using a single video camera |
US7227893B1 (en) * | 2002-08-22 | 2007-06-05 | Xlabs Holdings, Llc | Application-specific object-based segmentation and recognition system |
US7259792B2 (en) * | 2002-08-29 | 2007-08-21 | Sony Corporation | Optical system controller for video camera |
US7280745B2 (en) * | 2001-06-15 | 2007-10-09 | Stmicroelectronics Sa | Process and device for managing the memory space of a hard disk, in particular for a receiver of satellite digital television signals |
US20070248260A1 (en) * | 2006-04-20 | 2007-10-25 | Nokia Corporation | Supporting a 3D presentation |
US20070247697A1 (en) * | 2003-11-05 | 2007-10-25 | E Ink Corporation | Electro-optic displays, and materials for use therein |
US20080002023A1 (en) * | 2006-06-30 | 2008-01-03 | Microsoft Corporation Microsoft Patent Group | Parametric calibration for panoramic camera systems |
US20080024614A1 (en) * | 2006-07-25 | 2008-01-31 | Hsiang-Tsun Li | Mobile device with dual digital camera sensors and methods of using the same |
US20080024594A1 (en) * | 2004-05-19 | 2008-01-31 | Ritchey Kurtis J | Panoramic image-based virtual reality/telepresence audio-visual system and method |
US20080024596A1 (en) * | 2006-07-25 | 2008-01-31 | Hsiang-Tsun Li | Stereo image and video capturing device with dual digital sensors and methods of using the same |
US7340077B2 (en) * | 2002-02-15 | 2008-03-04 | Canesta, Inc. | Gesture recognition system using depth perceptive sensors |
US7345934B2 (en) * | 1997-08-07 | 2008-03-18 | Sandisk Corporation | Multi-state memory |
US7355444B2 (en) * | 2004-02-25 | 2008-04-08 | Ternarylogic Llc | Single and composite binary and multi-valued logic functions from gates and inverters |
US7365789B2 (en) * | 2002-09-03 | 2008-04-29 | Canon Kabushiki Kaisha | Autofocus method and apparatus |
US7365787B2 (en) * | 2004-02-26 | 2008-04-29 | Research In Motion Limited | Mobile device with integrated camera operations |
US20080106634A1 (en) * | 2006-11-07 | 2008-05-08 | Fujifilm Corporation | Multiple lens imaging apparatuses, and methods and programs for setting exposure of multiple lens imaging apparatuses |
US20080111583A1 (en) * | 2004-02-25 | 2008-05-15 | Peter Lablans | Implementing logic functions with non-magnitude based physical phenomena |
US7389580B2 (en) * | 2000-03-24 | 2008-06-24 | Cymbet Corporation | Method and apparatus for thin-film battery having ultra-thin electrolyte |
US7397690B2 (en) * | 2004-06-01 | 2008-07-08 | Temarylogic Llc | Multi-valued digital information retaining elements and memory devices |
US20080180987A1 (en) * | 2004-02-25 | 2008-07-31 | Peter Lablans | Multi-State Latches From n-State Reversible Inverters |
US7418275B2 (en) * | 2005-04-22 | 2008-08-26 | Su-Ching Yiu | Mobile communication apparatus with rotatable display screen |
US7424175B2 (en) * | 2001-03-23 | 2008-09-09 | Objectvideo, Inc. | Video segmentation using statistical pixel modeling |
US7429988B2 (en) * | 2006-03-06 | 2008-09-30 | At&T Intellectual Property I, L.P. | Methods and apparatus for convenient change of display characters on a handheld device |
US20080239170A1 (en) * | 2007-03-26 | 2008-10-02 | Motorola, Inc. | Low profile, high efficiency projector for handheld electronic devices |
US7477919B2 (en) * | 2002-09-19 | 2009-01-13 | Peter Warren | Handheld input/output device providing enhanced user interface for a mobile telephone |
US20090086015A1 (en) * | 2007-07-31 | 2009-04-02 | Kongsberg Defence & Aerospace As | Situational awareness observation apparatus |
US20090175323A1 (en) * | 2008-01-08 | 2009-07-09 | Qualcomm Incorporated | Methods and Devices for Wireless Chip-to-Chip Communications |
US7616248B2 (en) * | 2001-07-17 | 2009-11-10 | Eastman Kodak Company | Revised recapture camera and method |
US20090309177A1 (en) * | 2008-06-17 | 2009-12-17 | Samsung Electro-Mechanics Co., Ltd. | Wafer level camera module and method of manufacturing the same |
US7667765B2 (en) * | 2005-06-17 | 2010-02-23 | Hewlett-Packard Development Company, L.P. | Digital imaging device shutter calibration method and apparatus using multiple exposures of a single field |
US20100050779A1 (en) * | 2008-08-29 | 2010-03-04 | Pushparaj Victor L | Carbon nanotube-based load cells |
US7676150B2 (en) * | 2005-07-11 | 2010-03-09 | Fujifilm Corporation | Image pickup apparatus, image pickup method and image pickup program |
US7688306B2 (en) * | 2000-10-02 | 2010-03-30 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
US20100128015A1 (en) * | 2007-05-24 | 2010-05-27 | Liquavista B.V. | Electrowetting element, display device and control system |
US7741734B2 (en) * | 2005-07-12 | 2010-06-22 | Massachusetts Institute Of Technology | Wireless non-radiative energy transfer |
US20100171708A1 (en) * | 2009-01-08 | 2010-07-08 | Prime View International Co., Ltd. | Touch-control structure for a flexible display device |
US7835736B2 (en) * | 2006-04-03 | 2010-11-16 | Disney Enterprises, Inc. | System and method for initializing a portable communication device within a group at a point of activation |
US7869117B2 (en) * | 2007-08-01 | 2011-01-11 | Samsung Electronics Co., Ltd. | Electrophoretic display device |
US7911493B2 (en) * | 2005-12-16 | 2011-03-22 | Kyocera Corporation | Display accessory for a mobile device |
US20110095999A1 (en) * | 2009-10-23 | 2011-04-28 | Plastic Logic Limited | Electronic document reading devices |
US8180410B2 (en) * | 2008-06-06 | 2012-05-15 | Sandisk Technologies Inc. | Housing and clip assembly for portable electronics device |
US8311513B1 (en) * | 2007-06-27 | 2012-11-13 | ENORCOM Corporation | Automated mobile system |
Family Cites Families (70)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US2117257A (en) | 1935-08-13 | 1938-05-10 | Singer Mfg Co | Rheostat |
US4577827A (en) * | 1982-09-30 | 1986-03-25 | Lois G. Langley | Camera holder apparatus |
JPS6059899U (en) * | 1983-09-30 | 1985-04-25 | 日本ビクター株式会社 | tripod |
WO1994017636A1 (en) * | 1993-01-29 | 1994-08-04 | Bell Communications Research, Inc. | Automatic tracking camera control system |
JPH06273693A (en) * | 1993-03-23 | 1994-09-30 | Sony Corp | Picture display device and picture rotating display device |
US5612732A (en) * | 1993-03-31 | 1997-03-18 | Casio Computer Co., Ltd. | Portable compact imaging and displaying apparatus with rotatable camera |
US6747610B1 (en) * | 1997-07-22 | 2004-06-08 | Sanyo Electric Co., Ltd. | Stereoscopic image display apparatus capable of selectively displaying desired stereoscopic image |
US6421069B1 (en) * | 1997-07-31 | 2002-07-16 | Sony Corporation | Method and apparatus for including self-describing information within devices |
US6069593A (en) * | 1998-02-24 | 2000-05-30 | Motorola, Inc. | Display carrier and electronic display control for multiple displays in a portable electronic device |
US6029073A (en) * | 1998-02-24 | 2000-02-22 | Motorola, Inc | Display carrier with binocular viewing for a portable electronic device |
JPH11266384A (en) | 1998-03-18 | 1999-09-28 | Minolta Co Ltd | Digital camera system |
JPH11266383A (en) * | 1998-03-18 | 1999-09-28 | Minolta Co Ltd | Digital camera system |
US6157379A (en) * | 1998-05-21 | 2000-12-05 | Ericsson Inc. | Apparatus and method of formatting a list for display on a touchscreen |
US6160607A (en) * | 1998-06-01 | 2000-12-12 | Diaconu; Dan Mugur | Camera focus indicator |
US6076978A (en) * | 1998-08-07 | 2000-06-20 | Mcilvenna; Kevin Lee | Remote photography device |
US6392694B1 (en) * | 1998-11-03 | 2002-05-21 | Telcordia Technologies, Inc. | Method and apparatus for an automatic camera selection system |
US8290034B2 (en) * | 1998-12-21 | 2012-10-16 | Zin Stai Pte. In, Llc | Video transmission and display including bit-wise sub-sampling video compression |
US7221395B2 (en) * | 2000-03-14 | 2007-05-22 | Fuji Photo Film Co., Ltd. | Digital camera and method for compositing images |
JP2001352373A (en) * | 2000-06-08 | 2001-12-21 | Nintendo Co Ltd | Display processing system using mobile terminal, mobile terminal used for it, and conversion adaptor used for it |
US6512670B1 (en) * | 2000-10-05 | 2003-01-28 | International Business Machines Corporation | Detachable displays or portable devices |
US7109990B1 (en) * | 2000-11-28 | 2006-09-19 | Palm, Inc. | Circuit and method for temperature compensated contrast |
US20020094845A1 (en) * | 2001-01-16 | 2002-07-18 | Rei Inasaka | Body worn display system |
US7197584B2 (en) * | 2001-01-26 | 2007-03-27 | Dell Products L.P. | Removable personal digital assistant in a dual personal computer/personal digital assistant computer architecture |
US6853809B2 (en) * | 2001-01-30 | 2005-02-08 | Koninklijke Philips Electronics N.V. | Camera system for providing instant switching between wide angle and full resolution views of a subject |
KR100407966B1 (en) * | 2001-08-24 | 2003-12-01 | 엘지전자 주식회사 | System and Method for Controlling Cellular Phone with TV |
JP4785300B2 (en) * | 2001-09-07 | 2011-10-05 | 株式会社半導体エネルギー研究所 | Electrophoretic display device, display device, and electronic device |
US20030061606A1 (en) * | 2001-09-21 | 2003-03-27 | Stephan Hartwig | Method and apparatus for inhibiting functions of an electronic device according to predefined inhibit rules |
AU2002366409A1 (en) * | 2001-12-14 | 2003-06-30 | Koninklijke Philips Electronics N.V. | Method of enabling interaction using a portable device |
US20040202291A1 (en) * | 2002-08-27 | 2004-10-14 | Skinner Davey Nyle | Mobile phone with voice recording transfer function |
DE602004010363T2 (en) * | 2003-01-17 | 2008-10-23 | Nippon Telegraph And Telephone Corp. | REMOTE VIDEO DISPLAY METHOD, VIDEO RECORDING DEVICE, METHOD AND PROGRAM THEREFOR |
JP3937332B2 (en) * | 2003-03-11 | 2007-06-27 | ソニー株式会社 | Shooting system |
US7636132B2 (en) * | 2003-04-17 | 2009-12-22 | Sharp Kabushiki Kaisha | Transmitter, receiver, wireless system, control method, control program, and computer-readable recording medium containing the program |
JP4064877B2 (en) * | 2003-06-30 | 2008-03-19 | 株式会社東芝 | Digital camera with multiple display screens |
US20050048918A1 (en) | 2003-08-29 | 2005-03-03 | Onami, Llc | Radio controller system and method for remote devices |
US7209176B2 (en) | 2004-02-19 | 2007-04-24 | Chapman/Leonard Studio Equipment | Three-axis remote camera head |
US7643066B2 (en) * | 2004-02-19 | 2010-01-05 | Robert Bosch Gmbh | Method and apparatus for producing frame accurate position data in a PTZ dome camera with open loop control |
US7103698B2 (en) * | 2004-09-29 | 2006-09-05 | Hewlett-Packard Development Company, L.P. | Docking alignment sensor |
US7835505B2 (en) * | 2005-05-13 | 2010-11-16 | Microsoft Corporation | Phone-to-monitor connection device |
EP1727363B1 (en) * | 2005-05-23 | 2018-03-28 | Sony Mobile Communications, Inc. | Electronic equipment for a communication system |
US7379664B2 (en) * | 2005-07-26 | 2008-05-27 | Tinkers & Chance | Remote view and controller for a camera |
US7688203B2 (en) | 2006-01-12 | 2010-03-30 | Alfred Gerhold Rockefeller | Surveillance device by use of digital cameras linked to a cellular or wireless telephone |
US7427926B2 (en) * | 2006-01-26 | 2008-09-23 | Microsoft Corporation | Establishing communication between computing-based devices through motion detection |
EP2651147A1 (en) * | 2006-01-26 | 2013-10-16 | NEC Corporation | Electronic device and acoustic playback method |
US20070254640A1 (en) * | 2006-04-27 | 2007-11-01 | Bliss Stephen J | Remote control and viewfinder for mobile camera phone |
US7893990B1 (en) * | 2006-07-31 | 2011-02-22 | Cisco Technology, Inc. | Digital video camera with retractable data connector and resident software application |
ES2570191T3 (en) * | 2006-09-05 | 2016-05-17 | Alcatel Lucent | Smart device and user terminal that has a short-range interface and long-range interface |
JP5216204B2 (en) * | 2006-10-31 | 2013-06-19 | 株式会社半導体エネルギー研究所 | Liquid crystal display device and manufacturing method thereof |
US20080147730A1 (en) * | 2006-12-18 | 2008-06-19 | Motorola, Inc. | Method and system for providing location-specific image information |
JP4112596B1 (en) * | 2006-12-18 | 2008-07-02 | シャープ株式会社 | Liquid crystal display device, portable information terminal device, viewing angle control method, control program, and recording medium |
KR101172402B1 (en) * | 2007-02-14 | 2012-08-08 | 삼성전자주식회사 | Method for watching and hearing TV program in connection between mobile phone and digital TV, mobile phone and digital TV thereof |
JP2008211379A (en) * | 2007-02-23 | 2008-09-11 | Fujitsu Ltd | Display control program and portable terminal device |
JP4596280B2 (en) * | 2007-12-14 | 2010-12-08 | ソニー株式会社 | Video output device and method for forcibly displaying menu screen of video output device |
JP2009147828A (en) * | 2007-12-17 | 2009-07-02 | Sony Ericsson Mobilecommunications Japan Inc | Mobile terminal device, program, method and system for exchanging personal information |
TW200928568A (en) | 2007-12-18 | 2009-07-01 | Inventec Appliances Corp | Interaction system for mobile phone and camera and method thereof |
US20090251422A1 (en) * | 2008-04-08 | 2009-10-08 | Honeywell International Inc. | Method and system for enhancing interaction of a virtual keyboard provided through a small touch screen |
JP5183326B2 (en) * | 2008-06-30 | 2013-04-17 | キヤノン株式会社 | Communication apparatus and computer program |
JP5120269B2 (en) * | 2009-01-14 | 2013-01-16 | 富士通モバイルコミュニケーションズ株式会社 | Electronics |
US8264529B2 (en) | 2009-02-25 | 2012-09-11 | T-Mobile Usa, Inc. | Camera pod that captures images or video when triggered by a mobile device |
KR101511193B1 (en) | 2009-02-27 | 2015-04-10 | 파운데이션 프로덕션, 엘엘씨 | Headset-based telecommunications platform |
DE102009011270B4 (en) * | 2009-03-05 | 2012-01-19 | Intravis Gmbh | Method and device for quality assurance of at least partially transparent hollow bodies |
JP5340017B2 (en) * | 2009-04-28 | 2013-11-13 | 三洋電機株式会社 | Built-in battery and charging stand |
JP5323579B2 (en) * | 2009-05-01 | 2013-10-23 | 三洋電機株式会社 | Image capturing apparatus and video processing system including the same |
US8675084B2 (en) | 2009-09-04 | 2014-03-18 | Apple Inc. | Systems and methods for remote camera control |
US8675071B1 (en) * | 2009-10-02 | 2014-03-18 | Alarm.Com Incorporated | Video monitoring and alarm verification technology |
US8348527B2 (en) * | 2009-10-19 | 2013-01-08 | Goldberg Jeffrey M | Camera holder and shutter activation device |
CA2778284C (en) * | 2009-10-20 | 2018-04-24 | The Regents Of The University Of California | Incoherent lensfree cell holography and microscopy on a chip |
JP5488056B2 (en) * | 2010-03-03 | 2014-05-14 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
JP4885323B2 (en) * | 2010-05-19 | 2012-02-29 | シャープ株式会社 | Source device, sink device, system, program, and recording medium |
JP2012049825A (en) * | 2010-08-26 | 2012-03-08 | Canon Inc | Communication apparatus |
KR101763595B1 (en) * | 2010-11-16 | 2017-08-01 | 엘지전자 주식회사 | Method for processing data for monitoring service in network tv and the network tv |
Application Events
- 2010-12-31: US application US12/983,168 filed, published as US20110098083A1, not active (Abandoned)
- 2017-12-08: US application US15/836,815 filed, published as US10331024B2, not active (Expired - Fee Related)
Patent Citations (81)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4782402A (en) * | 1986-03-14 | 1988-11-01 | Pioneer Electronic Corporation | Video disk with multiplexed video and digital information |
US4866430A (en) * | 1986-12-11 | 1989-09-12 | Motorola, Inc. | Low voltage LED driver circuit |
US5257324A (en) * | 1991-11-01 | 1993-10-26 | The United States Of America As Represented By The Secretary Of The Navy | Zero-time-delay video processor circuit |
US5343243A (en) * | 1992-01-07 | 1994-08-30 | Ricoh Company, Ltd. | Digital video camera |
US6043310A (en) * | 1994-05-31 | 2000-03-28 | General Electric Company | Thin-wall housing |
US5680649A (en) * | 1994-06-29 | 1997-10-21 | Seiko Precision Inc. | Lens driving device for auto-focus camera |
US5646679A (en) * | 1994-06-30 | 1997-07-08 | Canon Kabushiki Kaisha | Image combining method and apparatus |
US7167155B1 (en) * | 1995-07-20 | 2007-01-23 | E Ink Corporation | Color electrophoretic displays |
US5568192A (en) * | 1995-08-30 | 1996-10-22 | Intel Corporation | Method and apparatus for processing digital video camera signals |
US6445833B1 (en) * | 1996-07-18 | 2002-09-03 | Sanyo Electric Co., Ltd | Device and method for converting two-dimensional video into three-dimensional video |
US7136633B2 (en) * | 1996-10-08 | 2006-11-14 | Cingular Wireless Ii, Llc | Method and apparatus for over-the-air programming of telecommunication services |
US6282320B1 (en) * | 1997-03-19 | 2001-08-28 | Sony Corporation | Video data decoding apparatus and method and video signal reproduction apparatus and method |
US6992733B1 (en) * | 1997-04-11 | 2006-01-31 | Micron Technology, Inc. | Backlighting system for an LCD |
US20040257436A1 (en) * | 1997-04-21 | 2004-12-23 | Sony Corporation | Controller for photographing apparatus and photographing system |
US5947359A (en) * | 1997-06-25 | 1999-09-07 | Kojima Press Industry Co., Ltd. | Device for holding a portable telephone |
US7345934B2 (en) * | 1997-08-07 | 2008-03-18 | Sandisk Corporation | Multi-state memory |
US6323858B1 (en) * | 1998-05-13 | 2001-11-27 | Imove Inc. | System for digitally capturing and recording panoramic movies |
US6178144B1 (en) * | 1998-06-02 | 2001-01-23 | Seagate Technology Llc | Magneto-optical recording system employing linear recording and playback channels |
US7102686B1 (en) * | 1998-06-05 | 2006-09-05 | Fuji Photo Film Co., Ltd. | Image-capturing apparatus having multiple image capturing units |
US6696198B2 (en) * | 1998-12-10 | 2004-02-24 | Matsushita Electric Industrial Co., Ltd. | Flat battery |
US7149178B2 (en) * | 1999-02-18 | 2006-12-12 | Lsi Logic Corporation | Method and format for reading and writing in a multilevel optical data systems |
US6975308B1 (en) * | 1999-04-30 | 2005-12-13 | Bitetto Frank W | Digital picture display frame |
US6574417B1 (en) * | 1999-08-20 | 2003-06-03 | Thomson Licensing S.A. | Digital video processing and interface system for video, audio and ancillary data |
US7123745B1 (en) * | 1999-11-24 | 2006-10-17 | Koninklijke Philips Electronics N.V. | Method and apparatus for detecting moving objects in video conferencing and other applications |
US6972796B2 (en) * | 2000-02-29 | 2005-12-06 | Matsushita Electric Industrial Co., Ltd. | Image pickup system and vehicle-mounted-type sensor system |
US7389580B2 (en) * | 2000-03-24 | 2008-06-24 | Cymbet Corporation | Method and apparatus for thin-film battery having ultra-thin electrolyte |
US6669346B2 (en) * | 2000-05-15 | 2003-12-30 | Darrell J. Metcalf | Large-audience, positionable imaging and display system for exhibiting panoramic imagery, and multimedia content featuring a circularity of action |
US6727941B1 (en) * | 2000-08-18 | 2004-04-27 | The United States Of America As Represented By The Secretary Of The Navy | Universal digital camera controller with automatic iris tuning |
US7688306B2 (en) * | 2000-10-02 | 2010-03-30 | Apple Inc. | Methods and apparatuses for operating a portable device based on an accelerometer |
US7020888B2 (en) * | 2000-11-27 | 2006-03-28 | Intellocity Usa, Inc. | System and method for providing an omnimedia package |
US7424175B2 (en) * | 2001-03-23 | 2008-09-09 | Objectvideo, Inc. | Video segmentation using statistical pixel modeling |
US6961055B2 (en) * | 2001-05-09 | 2005-11-01 | Free Radical Design Limited | Methods and apparatus for constructing virtual environments |
US7280745B2 (en) * | 2001-06-15 | 2007-10-09 | Stmicroelectronics Sa | Process and device for managing the memory space of a hard disk, in particular for a receiver of satellite digital television signals |
US6885374B2 (en) * | 2001-06-29 | 2005-04-26 | Intel Corporation | Apparatus, method and system with a graphics-rendering engine having a time allocator |
US7616248B2 (en) * | 2001-07-17 | 2009-11-10 | Eastman Kodak Company | Revised recapture camera and method |
US7340077B2 (en) * | 2002-02-15 | 2008-03-04 | Canesta, Inc. | Gesture recognition system using depth perceptive sensors |
US7126897B2 (en) * | 2002-03-18 | 2006-10-24 | Ricoh Company, Ltd. | Multi-level information recording apparatus, multi-level information recording method, multi-level information recording medium and multi-level information recording-reproducing apparatus |
US7085484B2 (en) * | 2002-07-17 | 2006-08-01 | Minolta Co., Ltd. | Driving device, position controller provided with driving device, and camera provided with position controller |
US7227893B1 (en) * | 2002-08-22 | 2007-06-05 | Xlabs Holdings, Llc | Application-specific object-based segmentation and recognition system |
US7259792B2 (en) * | 2002-08-29 | 2007-08-21 | Sony Corporation | Optical system controller for video camera |
US7365789B2 (en) * | 2002-09-03 | 2008-04-29 | Canon Kabushiki Kaisha | Autofocus method and apparatus |
US7477919B2 (en) * | 2002-09-19 | 2009-01-13 | Peter Warren | Handheld input/output device providing enhanced user interface for a mobile telephone |
US20060061951A1 (en) * | 2003-05-29 | 2006-03-23 | Fujitsu Limited | Method for controlling computer system having wireless display and computer system |
US20050122400A1 (en) * | 2003-08-13 | 2005-06-09 | Topcon Corporation | Photographic apparatus with function of image correction and method thereof |
US20070247697A1 (en) * | 2003-11-05 | 2007-10-25 | E Ink Corporation | Electro-optic displays, and materials for use therein |
US7355444B2 (en) * | 2004-02-25 | 2008-04-08 | Ternarylogic Llc | Single and composite binary and multi-valued logic functions from gates and inverters |
US20080111583A1 (en) * | 2004-02-25 | 2008-05-15 | Peter Lablans | Implementing logic functions with non-magnitude based physical phenomena |
US20080180987A1 (en) * | 2004-02-25 | 2008-07-31 | Peter Lablans | Multi-State Latches From n-State Reversible Inverters |
US7365787B2 (en) * | 2004-02-26 | 2008-04-29 | Research In Motion Limited | Mobile device with integrated camera operations |
US20080024594A1 (en) * | 2004-05-19 | 2008-01-31 | Ritchey Kurtis J | Panoramic image-based virtual reality/telepresence audio-visual system and method |
US7215144B2 (en) * | 2004-05-20 | 2007-05-08 | International Business Machines Corporation | Pre-emphasis driver with constant impedance |
US7397690B2 (en) * | 2004-06-01 | 2008-07-08 | Ternarylogic Llc | Multi-valued digital information retaining elements and memory devices |
US20060164883A1 (en) * | 2005-01-25 | 2006-07-27 | Peter Lablans | Multi-valued scrambling and descrambling of digital data on optical disks and other storage media |
US7418275B2 (en) * | 2005-04-22 | 2008-08-26 | Su-Ching Yiu | Mobile communication apparatus with rotatable display screen |
US7667765B2 (en) * | 2005-06-17 | 2010-02-23 | Hewlett-Packard Development Company, L.P. | Digital imaging device shutter calibration method and apparatus using multiple exposures of a single field |
US20070010200A1 (en) * | 2005-07-07 | 2007-01-11 | Kabushiki Kaisha Toshiba | Wireless communication device |
US7676150B2 (en) * | 2005-07-11 | 2010-03-09 | Fujifilm Corporation | Image pickup apparatus, image pickup method and image pickup program |
US7741734B2 (en) * | 2005-07-12 | 2010-06-22 | Massachusetts Institute Of Technology | Wireless non-radiative energy transfer |
US20070031062A1 (en) * | 2005-08-04 | 2007-02-08 | Microsoft Corporation | Video registration and image sequence stitching |
US20070035516A1 (en) * | 2005-08-09 | 2007-02-15 | Delphi Technologies, Inc. | Joystick sensor with light detection |
US20070120979A1 (en) * | 2005-11-21 | 2007-05-31 | Microsoft Corporation | Combined digital and mechanical tracking of a person or object using a single video camera |
US7911493B2 (en) * | 2005-12-16 | 2011-03-22 | Kyocera Corporation | Display accessory for a mobile device |
US7429988B2 (en) * | 2006-03-06 | 2008-09-30 | At&T Intellectual Property I, L.P. | Methods and apparatus for convenient change of display characters on a handheld device |
US7835736B2 (en) * | 2006-04-03 | 2010-11-16 | Disney Enterprises, Inc. | System and method for initializing a portable communication device within a group at a point of activation |
US20070248260A1 (en) * | 2006-04-20 | 2007-10-25 | Nokia Corporation | Supporting a 3D presentation |
US20080002023A1 (en) * | 2006-06-30 | 2008-01-03 | Microsoft Corporation | Parametric calibration for panoramic camera systems |
US20080024614A1 (en) * | 2006-07-25 | 2008-01-31 | Hsiang-Tsun Li | Mobile device with dual digital camera sensors and methods of using the same |
US20080024596A1 (en) * | 2006-07-25 | 2008-01-31 | Hsiang-Tsun Li | Stereo image and video capturing device with dual digital sensors and methods of using the same |
US20080106634A1 (en) * | 2006-11-07 | 2008-05-08 | Fujifilm Corporation | Multiple lens imaging apparatuses, and methods and programs for setting exposure of multiple lens imaging apparatuses |
US20080239170A1 (en) * | 2007-03-26 | 2008-10-02 | Motorola, Inc. | Low profile, high efficiency projector for handheld electronic devices |
US20100128015A1 (en) * | 2007-05-24 | 2010-05-27 | Liquavista B.V. | Electrowetting element, display device and control system |
US8311513B1 (en) * | 2007-06-27 | 2012-11-13 | ENORCOM Corporation | Automated mobile system |
US20090086015A1 (en) * | 2007-07-31 | 2009-04-02 | Kongsberg Defence & Aerospace As | Situational awareness observation apparatus |
US7869117B2 (en) * | 2007-08-01 | 2011-01-11 | Samsung Electronics Co., Ltd. | Electrophoretic display device |
US20090175323A1 (en) * | 2008-01-08 | 2009-07-09 | Qualcomm Incorporated | Methods and Devices for Wireless Chip-to-Chip Communications |
US8180410B2 (en) * | 2008-06-06 | 2012-05-15 | Sandisk Technologies Inc. | Housing and clip assembly for portable electronics device |
US20090309177A1 (en) * | 2008-06-17 | 2009-12-17 | Samsung Electro-Mechanics Co., Ltd. | Wafer level camera module and method of manufacturing the same |
US20100050779A1 (en) * | 2008-08-29 | 2010-03-04 | Pushparaj Victor L | Carbon nanotube-based load cells |
US20100171708A1 (en) * | 2009-01-08 | 2010-07-08 | Prime View International Co., Ltd. | Touch-control structure for a flexible display device |
US20110095999A1 (en) * | 2009-10-23 | 2011-04-28 | Plastic Logic Limited | Electronic document reading devices |
US8619021B2 (en) * | 2009-10-23 | 2013-12-31 | Plastic Logic Limited | Electronic document reading devices |
Cited By (106)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10500741B2 (en) | 2008-12-30 | 2019-12-10 | May Patents Ltd. | Electric shaver with imaging capability |
US10456933B2 (en) | 2008-12-30 | 2019-10-29 | May Patents Ltd. | Electric shaver with imaging capability |
US11206342B2 (en) | 2008-12-30 | 2021-12-21 | May Patents Ltd. | Electric shaver with imaging capability |
US11297216B2 (en) | 2008-12-30 | 2022-04-05 | May Patents Ltd. | Electric shaver with imaging capability |
US20120320180A1 (en) * | 2008-12-30 | 2012-12-20 | May Patents Ltd. | Electric shaver with imaging capability |
US11303792B2 (en) | 2008-12-30 | 2022-04-12 | May Patents Ltd. | Electric shaver with imaging capability |
US11336809B2 (en) | 2008-12-30 | 2022-05-17 | May Patents Ltd. | Electric shaver with imaging capability |
US11006029B2 (en) | 2008-12-30 | 2021-05-11 | May Patents Ltd. | Electric shaver with imaging capability |
US10999484B2 (en) | 2008-12-30 | 2021-05-04 | May Patents Ltd. | Electric shaver with imaging capability |
US10986259B2 (en) | 2008-12-30 | 2021-04-20 | May Patents Ltd. | Electric shaver with imaging capability |
US11356588B2 (en) | 2008-12-30 | 2022-06-07 | May Patents Ltd. | Electric shaver with imaging capability |
US10958819B2 (en) | 2008-12-30 | 2021-03-23 | May Patents Ltd. | Electric shaver with imaging capability |
US11616898B2 (en) | 2008-12-30 | 2023-03-28 | May Patents Ltd. | Oral hygiene device with wireless connectivity |
US10863071B2 (en) | 2008-12-30 | 2020-12-08 | May Patents Ltd. | Electric shaver with imaging capability |
US10730196B2 (en) | 2008-12-30 | 2020-08-04 | May Patents Ltd. | Electric shaver with imaging capability |
US10695922B2 (en) | 2008-12-30 | 2020-06-30 | May Patents Ltd. | Electric shaver with imaging capability |
US10661458B2 (en) | 2008-12-30 | 2020-05-26 | May Patents Ltd. | Electric shaver with imaging capability |
US11438495B2 (en) | 2008-12-30 | 2022-09-06 | May Patents Ltd. | Electric shaver with imaging capability |
US11206343B2 (en) | 2008-12-30 | 2021-12-21 | May Patents Ltd. | Electric shaver with imaging capability |
US11445100B2 (en) | 2008-12-30 | 2022-09-13 | May Patents Ltd. | Electric shaver with imaging capability |
US10456934B2 (en) | 2008-12-30 | 2019-10-29 | May Patents Ltd. | Electric hygiene device with imaging capability |
US10449681B2 (en) | 2008-12-30 | 2019-10-22 | May Patents Ltd. | Electric shaver with imaging capability |
US11509808B2 (en) | 2008-12-30 | 2022-11-22 | May Patents Ltd. | Electric shaver with imaging capability |
US11563878B2 (en) | 2008-12-30 | 2023-01-24 | May Patents Ltd. | Method for non-visible spectrum images capturing and manipulating thereof |
US9174351B2 (en) * | 2008-12-30 | 2015-11-03 | May Patents Ltd. | Electric shaver with imaging capability |
US20160001453A1 (en) * | 2008-12-30 | 2016-01-07 | May Patents Ltd. | Electric shaver with imaging capability |
US11570347B2 (en) | 2008-12-30 | 2023-01-31 | May Patents Ltd. | Non-visible spectrum line-powered camera |
US11838607B2 (en) | 2008-12-30 | 2023-12-05 | May Patents Ltd. | Electric shaver with imaging capability |
US11575817B2 (en) | 2008-12-30 | 2023-02-07 | May Patents Ltd. | Electric shaver with imaging capability |
US11800207B2 (en) | 2008-12-30 | 2023-10-24 | May Patents Ltd. | Electric shaver with imaging capability |
US11778290B2 (en) | 2008-12-30 | 2023-10-03 | May Patents Ltd. | Electric shaver with imaging capability |
US11758249B2 (en) | 2008-12-30 | 2023-09-12 | May Patents Ltd. | Electric shaver with imaging capability |
US11575818B2 (en) | 2008-12-30 | 2023-02-07 | May Patents Ltd. | Electric shaver with imaging capability |
US11716523B2 (en) | 2008-12-30 | 2023-08-01 | Volteon Llc | Electric shaver with imaging capability |
US9950434B2 (en) | 2008-12-30 | 2018-04-24 | May Patents Ltd. | Electric shaver with imaging capability |
US9950435B2 (en) * | 2008-12-30 | 2018-04-24 | May Patents Ltd. | Electric shaver with imaging capability |
US10220529B2 (en) | 2008-12-30 | 2019-03-05 | May Patents Ltd. | Electric hygiene device with imaging capability |
US11303791B2 (en) | 2008-12-30 | 2022-04-12 | May Patents Ltd. | Electric shaver with imaging capability |
US10868948B2 (en) | 2008-12-30 | 2020-12-15 | May Patents Ltd. | Electric shaver with imaging capability |
US9848174B2 (en) | 2008-12-30 | 2017-12-19 | May Patents Ltd. | Electric shaver with imaging capability |
US8564566B2 (en) * | 2010-06-11 | 2013-10-22 | Fih (Hong Kong) Limited | Input device using projector |
US20110304580A1 (en) * | 2010-06-11 | 2011-12-15 | Fih (Hong Kong) Limited | Input device using projector |
US9497391B2 (en) * | 2010-10-05 | 2016-11-15 | Sony Corporation | Apparatus and method for displaying images |
US20150249794A1 (en) * | 2010-10-05 | 2015-09-03 | Sony Computer Entertainment Inc. | Apparatus and method for displaying images |
US20120115543A1 (en) * | 2010-11-05 | 2012-05-10 | Hon Hai Precision Industry Co., Ltd. | Head mounted display apparatus with phone function |
US20120194636A1 (en) * | 2011-01-31 | 2012-08-02 | Sony Corporation | Information processing apparatus, information processing method, program, and imaging apparatus |
CN102625023A (en) * | 2011-01-31 | 2012-08-01 | 索尼公司 | Information processing apparatus, information processing method, program, and imaging apparatus |
US9713764B2 (en) * | 2011-08-04 | 2017-07-25 | Zvi Minkovitch | Method, system and apparatus for managing a football match |
US20140179384A1 (en) * | 2011-08-04 | 2014-06-26 | Zvi Minkovitch | Method, system and apparatus for managing a football match |
US20150181197A1 (en) * | 2011-10-05 | 2015-06-25 | Amazon Technologies, Inc. | Stereo imaging using disparate imaging devices |
US9325968B2 (en) * | 2011-10-05 | 2016-04-26 | Amazon Technologies, Inc. | Stereo imaging using disparate imaging devices |
US20150201193A1 (en) * | 2012-01-10 | 2015-07-16 | Google Inc. | Encoding and decoding techniques for remote screen sharing of media content using video source and display parameters |
US8792189B2 (en) | 2012-01-26 | 2014-07-29 | Bibek Sahu | Personal media viewing device |
US11678804B2 (en) | 2012-03-07 | 2023-06-20 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
US20130237811A1 (en) * | 2012-03-07 | 2013-09-12 | Speir Technologies Inc. | Methods and systems for tracking and guiding sensors and instruments |
CN104271046A (en) * | 2012-03-07 | 2015-01-07 | 齐特奥股份有限公司 | Methods and systems for tracking and guiding sensors and instruments |
US10426350B2 (en) | 2012-03-07 | 2019-10-01 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
US9561019B2 (en) * | 2012-03-07 | 2017-02-07 | Ziteo, Inc. | Methods and systems for tracking and guiding sensors and instruments |
US9013542B2 (en) * | 2012-03-26 | 2015-04-21 | Altek Corporation | Image capture device and image synthesis method thereof |
US20130250041A1 (en) * | 2012-03-26 | 2013-09-26 | Altek Corporation | Image capture device and image synthesis method thereof |
US20140098193A1 (en) * | 2012-10-05 | 2014-04-10 | Research In Motion Limited | Methods and devices for generating a stereoscopic image |
US9137517B2 (en) * | 2012-10-05 | 2015-09-15 | Blackberry Limited | Methods and devices for generating a stereoscopic image |
US9148651B2 (en) * | 2012-10-05 | 2015-09-29 | Blackberry Limited | Methods and devices for generating a stereoscopic image |
US20140098197A1 (en) * | 2012-10-05 | 2014-04-10 | Research In Motion Limited | Methods and devices for generating a stereoscopic image |
US20140096642A1 (en) * | 2012-10-05 | 2014-04-10 | Remy Technologies, Llc | Starter motor |
US20180020158A1 (en) * | 2013-03-14 | 2018-01-18 | Futurewei Technologies, Inc. | Camera Augmented Reality Based Activity History Tracking |
US20170034433A1 (en) * | 2013-03-14 | 2017-02-02 | Futurewei Technologies, Inc. | Camera Augmented Reality Based Activity History Tracking |
US10129462B2 (en) * | 2013-03-14 | 2018-11-13 | Futurewei Technologies, Inc. | Camera augmented reality based activity history tracking |
US10212337B2 (en) * | 2013-03-14 | 2019-02-19 | Futurewei Technologies, Inc. | Camera augmented reality based activity history tracking |
US10967240B2 (en) | 2013-03-18 | 2021-04-06 | Fb-Mm Ltd. | Sports match refereeing system |
US9889367B2 (en) | 2013-03-18 | 2018-02-13 | Zvi Minkovitch | Sports match refereeing system |
US10596444B2 (en) | 2013-03-18 | 2020-03-24 | Fb-Mm Ltd. | Sports match refereeing system |
DE102013006330A1 (en) * | 2013-04-12 | 2014-10-16 | GM Global Technology Operations LLC (n. d. Gesetzen des Staates Delaware) | Method and device for determining dimensions of a vehicle |
EP2860961A1 (en) * | 2013-10-11 | 2015-04-15 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
CN104580687A (en) * | 2013-10-11 | 2015-04-29 | Lg电子株式会社 | Mobile terminal and controlling method thereof |
US20150296174A1 (en) * | 2014-03-26 | 2015-10-15 | Bosh Technologies Llc | Modular display system |
WO2016025962A1 (en) * | 2014-08-15 | 2016-02-18 | The University Of Akron | Device and method for three-dimensional video communication |
US20170295357A1 (en) * | 2014-08-15 | 2017-10-12 | The University Of Akron | Device and method for three-dimensional video communication |
US10171792B2 (en) * | 2014-08-15 | 2019-01-01 | The University Of Akron | Device and method for three-dimensional video communication |
US11464503B2 (en) | 2014-11-14 | 2022-10-11 | Ziteo, Inc. | Methods and systems for localization of targets inside a body |
US10617401B2 (en) | 2014-11-14 | 2020-04-14 | Ziteo, Inc. | Systems for localization of targets inside a body |
US10157328B2 (en) * | 2014-11-20 | 2018-12-18 | Idemia France | Method and device for authentification of a user |
US20160148075A1 (en) * | 2014-11-20 | 2016-05-26 | Oberthur Technologies | Method and device for authentification of a user |
US11019323B2 (en) * | 2015-02-06 | 2021-05-25 | Tara Chand Singhal | Apparatus and method for 3D like camera system in a handheld mobile wireless device |
US20160234477A1 (en) * | 2015-02-06 | 2016-08-11 | Tara Chand Singhal | Apparatus and method for 3d like camera system in a handheld mobile wireless device |
US20160295121A1 (en) * | 2015-04-03 | 2016-10-06 | Canon Kabushiki Kaisha | Accessory for camera and storage medium storing display control program |
US10230896B2 (en) * | 2015-04-03 | 2019-03-12 | Canon Kabushiki Kaisha | Accessory for camera and storage medium storing display control program |
US20170011732A1 (en) * | 2015-07-07 | 2017-01-12 | Aumed Corporation | Low-vision reading vision assisting system based on ocr and tts |
US10466947B2 (en) | 2015-11-16 | 2019-11-05 | Goertek Inc. | Method and device for black and white screen display based on android platform, and smart terminal |
CN105468320A (en) * | 2015-11-16 | 2016-04-06 | 歌尔声学股份有限公司 | Black and white screen display method and device based on Android platform and intelligent terminal |
US20170353636A1 (en) * | 2016-06-03 | 2017-12-07 | Ancil Bishop | Tri-lens image capturing device |
US10334203B2 (en) * | 2016-11-07 | 2019-06-25 | Hong Fu Jin Precision Industry (Shenzhen) Co., Ltd. | Thin-profile television device |
CN110196615A (en) * | 2018-02-26 | 2019-09-03 | 广东欧珀移动通信有限公司 | A kind of control method of functional module group, electronic equipment and readable storage medium storing program for executing |
US11439358B2 (en) | 2019-04-09 | 2022-09-13 | Ziteo, Inc. | Methods and systems for high performance and versatile molecular imaging |
US11883214B2 (en) | 2019-04-09 | 2024-01-30 | Ziteo, Inc. | Methods and systems for high performance and versatile molecular imaging |
US11563882B2 (en) | 2019-08-30 | 2023-01-24 | Puwell Technology Llc | Method and system for control of a digital camera system |
US11070712B2 (en) * | 2019-08-30 | 2021-07-20 | Puwell Technology Llc | Method and system for control of a digital camera system |
WO2021052252A1 (en) * | 2019-09-17 | 2021-03-25 | 深圳市万普拉斯科技有限公司 | Camera raising and retracting control method and apparatus, and mobile terminal |
US20210321077A1 (en) * | 2020-01-09 | 2021-10-14 | JUC Holdings Limited | 2d digital image capture system and simulating 3d digital image sequence |
US20210289190A1 (en) * | 2020-01-09 | 2021-09-16 | Jerry Nims | 2d image capture system & display of 3d digital image |
US20210297647A1 (en) * | 2020-01-09 | 2021-09-23 | Jerry Nims | 2d image capture system, transmission & display of 3d digital image |
US20220078392A1 (en) * | 2020-01-09 | 2022-03-10 | JUC Holdings Limited | 2d digital image capture system, frame speed, and simulating 3d digital image sequence |
US20220051427A1 (en) * | 2020-01-09 | 2022-02-17 | JUC Holdings Limited | Subsurface imaging and display of 3d digital image and 3d image sequence |
US11917119B2 (en) * | 2020-01-09 | 2024-02-27 | Jerry Nims | 2D image capture system and display of 3D digital image |
US20220385807A1 (en) * | 2021-05-28 | 2022-12-01 | Jerry Nims | 2d digital image capture system and simulating 3d digital image and sequence |
US20220385880A1 (en) * | 2021-05-28 | 2022-12-01 | Jerry Nims | 2d digital image capture system and simulating 3d digital image and sequence |
Also Published As
Publication number | Publication date |
---|---|
US20180095357A1 (en) | 2018-04-05 |
US10331024B2 (en) | 2019-06-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10331024B2 (en) | Mobile and portable screen to view an image recorded by a camera | |
US9171221B2 (en) | Camera to track an object | |
US10831093B1 (en) | Focus control for a plurality of cameras in a smartphone | |
US10261408B2 (en) | Mobile and portable camera platform for tracking an object | |
US11606482B2 (en) | Methods for camera movement compensation | |
US11659074B2 (en) | Information processing terminal | |
US9007431B1 (en) | Enabling the integration of a three hundred and sixty degree panoramic camera within a consumer device case | |
US9851803B2 (en) | Autonomous computing and telecommunications head-up displays glasses | |
US9253397B2 (en) | Array camera, mobile terminal, and methods for operating the same | |
US9618747B2 (en) | Head mounted display for viewing and creating a media file including omnidirectional image data and corresponding audio data | |
US20100194860A1 (en) | Method of stereoscopic 3d image capture using a mobile device, cradle or dongle | |
US10218942B2 (en) | Positionable camera | |
JP2013519910A (en) | Accessories for personal communication devices | |
US10277789B2 (en) | Camera device | |
EP2717563B1 (en) | Image processing device | |
US20040189850A1 (en) | Digital camera with detachable display module | |
US20220385882A1 (en) | Information processing apparatus, control method, and non-transitory computer readable medium | |
US9811160B2 (en) | Mobile terminal and method for controlling the same | |
US9936111B1 (en) | Articulating imaging module for electronic devices | |
CN220570624U (en) | Image capturing system and image capturing apparatus | |
US20240053660A1 (en) | Interconnect mechanism for image capture device | |
US20220385830A1 (en) | Electronic apparatus, control method, and non-transitory computer readable medium | |
WO2016085669A1 (en) | A modular image capture device | |
KR20160149945A (en) | Mobile terminal | |
JP5450864B1 (en) | Combined system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |