US20120062691A1 - Camera Control - Google Patents
- Publication number
- US20120062691A1 (application US 13/081,354)
- Authority
- US
- United States
- Prior art keywords
- camera
- image
- images
- computer readable
- processor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F16—ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
- F16M—FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
- F16M11/00—Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
- F16M11/02—Heads
- F16M11/04—Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand
- F16M11/06—Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand allowing pivoting
- F16M11/10—Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand allowing pivoting around a horizontal axis
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F16—ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
- F16M—FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
- F16M11/00—Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
- F16M11/20—Undercarriages with or without wheels
- F16M11/2007—Undercarriages with or without wheels comprising means allowing pivoting adjustment
- F16M11/2014—Undercarriages with or without wheels comprising means allowing pivoting adjustment around a vertical axis
-
- F—MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
- F16—ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
- F16M—FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
- F16M11/00—Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
- F16M11/20—Undercarriages with or without wheels
- F16M11/24—Undercarriages with or without wheels changeable in height or length of legs, also for transport only, e.g. by means of tubes screwed into each other
- F16M11/242—Undercarriages with or without wheels changeable in height or length of legs, also for transport only, e.g. by means of tubes screwed into each other by spreading of the legs
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/56—Accessories
- G03B17/561—Support related camera accessories
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B37/00—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
- G03B37/02—Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe with scanning movement of lens or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
Definitions
- Digital cameras have proliferated to a level unimaginable only a few short years ago. Not only are digital cameras stand-alone devices, they are also features added to numerous electronic products. Digital cameras can now be found on, for example, portable electronic devices such as mobile phones, IPODS, and IPHONES. Typically, however, digital cameras found on portable electronic devices suffer from limitations such as a limited field of view, limited resolution, and a small aperture.
- Embodiments of the invention may provide a system including a camera that includes a lens having an aperture; a tripod configured to support the camera; a mount configured to be coupled to the tripod and to the camera, the mount including a pivot that is configured such that the camera is rotatable around an axis that is aligned with the aperture of the camera; a motor configured to control the rotation of the mount; and a memory containing computer readable instructions that, when executed by a processor, cause the processor to calculate a desired position of the camera using sensor information, control the motor such that the camera is moved to the desired position, control the shutter of the camera to capture an image, and store the captured image.
- Implementations of the invention can provide one or more of the following features.
- The processor is included in the camera.
- The motor is configured to control the rotation of the mount around a vertical axis that is aligned with the aperture of the camera.
- The computer readable instructions are further configured to cause the processor to capture a plurality of images that are combinable to create a composite image.
- The composite image is selected from the group consisting of a panoramic image, a cylindrical image, a spherical image, and a high dynamic range image.
- The computer readable instructions are further configured to cause the processor to combine the plurality of images to create a composite image.
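The claimed instruction sequence (read sensor information, calculate a desired position, drive the motor there, trigger the shutter, store the image) can be sketched as a small control loop. This is a minimal illustration, not the patent's actual implementation: the `read_compass`/`rotate_to` interfaces, the stub hardware, and the 30-degree step are all assumptions.

```python
class PanoramaController:
    """Sketch of the claimed capture loop: calculate a desired position
    from sensor information, drive the motor, fire the shutter, store."""

    def __init__(self, camera, motor, storage, step_deg=30.0):
        self.camera = camera      # assumed to expose read_compass() and capture()
        self.motor = motor        # assumed to expose rotate_to(angle_deg)
        self.storage = storage    # list-like store for captured images
        self.step_deg = step_deg  # assumed angular spacing between shots

    def desired_position(self):
        # Snap the current compass heading to the nearest capture angle.
        heading = self.camera.read_compass()
        return (round(heading / self.step_deg) * self.step_deg) % 360.0

    def capture_one(self):
        target = self.desired_position()
        self.motor.rotate_to(target)          # move the camera into position
        image = self.camera.capture()         # control the shutter
        self.storage.append((target, image))  # store the captured image
        return target


# Stub hardware so the sketch runs without a real camera or motor.
class StubCamera:
    def __init__(self, heading):
        self.heading = heading

    def read_compass(self):
        return self.heading

    def capture(self):
        return b"jpeg-bytes"


class StubMotor:
    def __init__(self):
        self.angle = 0.0

    def rotate_to(self, angle_deg):
        self.angle = angle_deg


shots = []
controller = PanoramaController(StubCamera(44.0), StubMotor(), shots)
angle = controller.capture_one()  # heading 44 deg snaps to the 30 deg stop
```

A real controller would repeat `capture_one` across a full sweep; the single-shot version above keeps the claimed sequence visible.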
- Embodiments of the invention may provide a non-transitory computer readable medium including computer readable instructions that, when executed by a processor, are configured to cause the processor to: control the operation of a camera; collect information from a sensor in the camera; process the sensor information to calculate a position of the camera; control a motor attached to the camera to position the camera using the calculated position, wherein the motor is configured to rotate the camera around an axis that is aligned with an aperture of the camera; capture an image using the camera; and store the captured image.
- Implementations of the invention can provide one or more of the following features.
- The computer readable instructions are executed in a processor included in the camera. Controlling the motor includes controlling the rotation of the camera around a vertical axis that is aligned with the aperture of the camera.
- The computer readable instructions are further configured to cause the processor to capture a plurality of images that are combinable to create a composite image.
- The computer readable instructions are further configured to cause the processor to create a plurality of images that are combinable into at least one of a panoramic image, a cylindrical image, a spherical image, and a high dynamic range image.
- The computer readable instructions are further configured to cause the processor to combine the plurality of images to create a composite image.
- Multi-picture composite images can be created. Panoramic, cylindrical, and spherical images can be captured. A camera can be automatically controlled and positioned using an image application. A camera can provide information to an image application that is used to control the camera. Images captured by the camera can be stored remotely in the cloud. A user can be provided guidance on how to capture the successive images used to create a multi-picture composite image. Images captured by the camera can be backed up by automatically uploading the images to a storage location using any available wireless communications network.
- FIG. 1 is a diagram of a system for capturing images.
- FIG. 2 is a diagram of a portion of the system shown in FIG. 1 .
- FIG. 3 is a diagram of a portion of the system shown in FIG. 1 .
- FIG. 4 is a diagram of a system for capturing images.
- FIG. 5 is a diagram of a system for capturing images.
- FIG. 6 is a diagram of a portion of the system shown in FIG. 1 .
- FIGS. 7-11 are exemplary screen shots of a computer application for use with a camera.
- FIG. 12 is a process flow diagram of a process for use with a camera.
- Embodiments of the invention provide techniques for capturing images using a smartphone.
- The smartphone is supported on a mount that is configured such that the smartphone can rotate about several axes that are aligned with the focal plane/point of a camera included in the smartphone.
- Some embodiments include a motorized mount that can be configured to automatically rotate the smartphone to capture images in an automated fashion.
- An image capture application can also be provided that is configured to work with and control the mount and to control the smartphone to capture images.
- The image capture application can be configured to automatically capture a series of images using the smartphone that can be combined into a single image.
- The image capture application can also be configured to automatically copy media files to a remote storage location.
- The image capture application can also be configured to allow a user to edit and/or manipulate images captured by the smartphone.
- The image capture application can also be configured to register the date, time, location, direction, and camera orientation of every image and/or video that is captured. This information can then be attached to photographs and videos that are captured such that they can easily be organized and searched at a later time. Other embodiments are within the scope of the invention.
- An exemplary system 5 includes a camera 10, a case 15, a mount 20, and a tripod 25. While FIG. 1 shows an APPLE IPHONE as the camera 10, other cameras can be used.
- The camera 10 is preferably a smart camera that is configured to take pictures and/or capture video.
- The camera 10 preferably includes a microprocessor on which a third-party application can be executed.
- The camera 10 can be a stand-alone camera or can be part of another device such as a smart phone, MP3 player, or tablet PC.
- Examples of the camera 10 are the APPLE IPHONE, APPLE IPAD, RIM BLACKBERRY, APPLE IPOD, and DROID INCREDIBLE.
- The camera 10 can be configured to download and execute applications provided by third parties (e.g., via an Internet connection).
- The APPLE IPHONE is configured to download applications from an application store.
- The combination of the camera functionality and the ability to execute third-party applications can create novel functionality and enable users to create images of varying quality and type that can be shared online.
- The mount 20 is preferably configured to connect to the camera 10 such that the mount 20 can support and/or maneuver the camera 10.
- The mount 20 can connect directly to a fitting in the camera 10, and/or can use an adapter such as a case 15 that is configured to fit the camera 10.
- The mount 20 includes one or more gimbals that are configured to create two rotational planes that intersect at the aperture and/or focal plane of the lens of the camera 10. In this configuration, parallax that results from rotation of the camera 10 can be reduced or eliminated.
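The benefit of intersecting the rotation axes at the aperture can be quantified. If the pivot is offset from the lens entrance pupil, rotating the camera also translates the pupil, so near subjects shift against the distant background between frames. A rough small-angle estimate (the geometry and the numbers are illustrative, not from the patent):

```python
import math

def parallax_shift_deg(pivot_offset_m, rotation_deg, subject_dist_m):
    """Approximate angular parallax (degrees) between a near subject and
    the distant background when the camera rotates about a pivot offset
    from the lens entrance pupil. Offset 0 gives no parallax, which is
    why the mount aligns its axes with the aperture."""
    # Chord the entrance pupil travels while rotating about the offset pivot.
    translation = 2.0 * pivot_offset_m * math.sin(math.radians(rotation_deg) / 2.0)
    # Small-angle estimate of how far the near subject appears to move.
    return math.degrees(translation / subject_dist_m)
```

With a 5 cm offset, a 30-degree rotation, and a subject 1 m away, the subject shifts almost 1.5 degrees against the background, enough to cause visible stitching errors; with the pivot on the pupil the shift is zero.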
- The mount 20 can be set on a stable surface and/or can be used in conjunction with a tripod (or other hardware used to eliminate vibrations) to provide manual and/or motorized control of the rotation of the camera 10.
- The mount 20 can also function as a handle for the camera 10.
- An exemplary version of the mount 20 includes pivots 50 , 55 , 60 , 65 that are configured to pivot along different axes.
- Pivots 50 and 55 are configured to allow the camera 10 to rotate about the y-axis, the pivot 60 is configured to allow the camera 10 to rotate about the x-axis, and the pivot 65 is configured to allow the camera 10 to rotate about the z-axis.
- The pivots 50, 55, 60, 65 can be configured to be locked in position (e.g., using a lever).
- The pivots 50, 55, 60, 65 can also include one or more mechanical detents at one or more predefined locations (e.g., at 45 and 90 degrees).
- The mount 20 can also include one or more motors 70 that are configured to rotate the camera 10 under the control of a computerized application.
- The motors used to rotate the camera 10 are small motors requiring little electrical power.
- The motor can be configured to rotate the camera 10 on one or more pivot points of the mount 20 (e.g., the x, y, and z axes).
- Power is provided to the motor by a battery pack 75 and/or an external power connection.
- Motorized embodiments of the mount 20 can be configured in various ways. For example, there can be a single motor 70 configured to rotate the camera 10 about a single axis or multiple motors 70 can be used to rotate the camera 10 about more than one axis.
- A panorama mount can automate image capture in the horizontal plane while the vertical panning/tilt angle can be adjusted manually to capture a virtual reality or spherical image.
- A virtual reality mount can be motorized on both the vertical and horizontal plane and can fully automate capturing a spherical image.
- The mount 20 can be configured to communicate with one or more other devices.
- The mount 20 can be configured to communicate with the camera 10, a remote computer, and/or a remote control using a wireless (e.g., Infrared, Bluetooth, 802.11, etc.) and/or wired connection (e.g., a multi-pin connector).
- The remote control can be, for example, a small traditional handheld unit, or another processor such as a smart phone, PC, and/or tablet operating a control application.
- The communication connections can be used to control the mount 20 (e.g., to control the direction and speed of movement in the vertical and/or horizontal planes) and/or to control the camera 10 itself (e.g., to control the parameters of image capture).
- The camera 10 can also use the connection with the mount 20 to provide information collected by sensors in the camera 10.
- The camera 10 can provide information captured by onboard sensors (e.g., a gyroscope, clock, GPS receiver, and/or compass).
- The mount 20 and/or the camera 10 can be configured to be controlled by a remote control 80.
- The remote control 80 can be used to remotely control how and when the camera 10 takes a picture.
- The remote control 80 can also be used to control the movement of a motorized version of the mount 20.
- The remote control 80 can also be used for “follow me” functionality where the mount 20 is configured to automatically point the camera at the remote control 80 so that, for example, the person holding the remote control 80 is always in the picture and/or video being captured by the camera 10.
- The remote control can also be configured to snap to the mount 20 when not in use.
- The mount 20 can include additional features that make control of the camera 10 easier.
- The mount 20 can include a detent on the pivot for the vertical plane that can lock the camera 10 in position to simplify initial setup. This can be used in conjunction with, for example, an electronic version of a bubble level in an image application and can be used to level the mount when adjusting a tripod or other camera stabilization device.
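An electronic bubble level of the kind described can be derived from the camera's accelerometer: when the device is level, gravity lies entirely on one axis. A sketch, assuming a 3-axis accelerometer with z pointing out of the camera back and the usual pitch/roll conventions (neither the axis convention nor the tolerance comes from the patent):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from a 3-axis accelerometer
    reading, as an on-screen bubble level might. The device is level
    when both angles are near zero (gravity entirely on the z axis)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def is_level(ax, ay, az, tol_deg=1.0):
    # Assumed tolerance: within 1 degree on both axes counts as level.
    pitch, roll = tilt_from_accel(ax, ay, az)
    return abs(pitch) <= tol_deg and abs(roll) <= tol_deg
```

The application would poll these angles while the user adjusts the tripod head and play the confirmation tone once `is_level` becomes true.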
- The mount 20 can also be configured to easily switch the camera 10 between portrait and landscape modes.
- The tripod 25 can be a standard tripod.
- The tripod 25 can include legs 30 that are used to support and stabilize the camera 10.
- The mount 20 and/or tripod 25 can also be configured to function as a handle when not being used to support the camera 10.
- One of the three legs of the tripod can be configured as a larger leg such that the other two smaller legs, when folded, fit inside the larger leg (e.g., as shown in FIG. 6 ).
- The tripod 25 can also include a power source (e.g., a battery pack) used to power the mount 20.
- The system 5 is preferably configured to work with an image application.
- The image application can be executed by the camera 10 and/or another processor (e.g., a remote computer).
- The user can preferably interface with the image application via a display on the camera 10 and/or via an external display.
- The image application can preferably be used to control the camera 10 and/or the mount 20.
- The image application can also preferably be used to receive images captured by the camera 10 and to process the same.
- The image application can also be configured to allow a user to edit images captured by the camera 10.
- The image application can be, for example, an application that is configured to run on an APPLE IPHONE.
- The image application can be configured to communicate with network servers via a wired and/or wireless network connection (e.g., a 3G connection provided by a cellular service provider).
- The image application can be configured to process and upload images captured by the camera 10.
- The image application can be configured to upload images to a remote server in the cloud.
- The image application can also be configured to automatically communicate with a remote network-based application that can process images and can be used for editing by the user.
- The image application can be configured to use information provided by the camera 10 to control the camera 10.
- The image application can be configured to receive information from sensors contained in the camera 10 to control the camera 10 and/or the mount 20.
- The image application can be configured to use location information, GPS information, azimuth, orientation, brightness, acceleration, and sound to control the camera 10.
- The image application can be configured to receive sensor information from the camera 10 that can be used to calculate the position of the camera 10. Using the position information, the image application can be configured to control the camera 10 to, for example, take a series of consecutive images that can then be stitched together later to create a panoramic image.
- The image application can be configured to control the capture of photos by the camera 10.
- The image application can be configured to use additional information to control the parameters of the image capture itself (e.g., using brightness information provided by the camera 10 to control the shutter speed and aperture of the camera 10).
- The focus, aperture, and shutter speed of the camera 10 can be set to custom values based on algorithms that evaluate the image and/or can be based on controls and readings available to the camera 10.
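The brightness-to-exposure step described above is conventionally expressed through the exposure value relation EV = log2(N²/t), where N is the f-number and t the shutter time in seconds. A sketch of how a measured EV could set shutter speed at a chosen aperture (the use of this exact formula is standard photographic practice, not something the patent specifies):

```python
import math

def shutter_for_ev(ev, f_number):
    """Shutter time (seconds) realizing exposure value `ev` at the given
    f-number, from the standard relation EV = log2(N^2 / t)."""
    return f_number ** 2 / (2.0 ** ev)

def ev_for_exposure(f_number, shutter_s):
    """Inverse: the exposure value produced by an aperture/shutter pair."""
    return math.log2(f_number ** 2 / shutter_s)
```

For example, a bright scene metered at EV 15 shot at f/4 calls for a shutter time of 16/32768 s, about 1/2000 s.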
- The image application can be configured to use many different types of information in order to control the camera 10.
- The image application can use information provided by the camera 10 (e.g., GPS information, brightness information, field of view, aperture, shutter speed, focus, gyroscope information, orientation information, etc.).
- The image application can be configured to take multiple images that are combined to create a processed picture such as a panoramic, cylindrical, spherical, high dynamic range (HDR), or virtual reality image. This process can be improved by the fact that the mount 20 rotates about axes that are aligned with one or more focal planes of the camera 10.
- The image application can be configured to process the multiple images on the camera 10 and/or at a remote computer (e.g., a remote webpage or using third-party image editing software).
- Sensor data from the camera 10 can be added to the images (e.g., as metadata) in order to combine the multiple images, and/or for other processing.
- Azimuth and GPS information can be used by mapping software to show the direction the camera was pointed when an image was captured.
- Other metadata can be included as well, such as time, date, latitude, longitude, shutter speed, aperture, white balance, lens information, etc.
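The per-image metadata described above (time, position, direction, exposure settings) can be assembled as a simple record before being written into the image file. A sketch using a plain dict; the field names are illustrative, not an actual EXIF schema:

```python
from datetime import datetime, timezone

def build_capture_metadata(lat, lon, azimuth_deg, shutter_s, f_number,
                           captured_at=None):
    """Assemble the per-image metadata the text describes (time, date,
    position, direction, exposure settings) as a dict that could later
    be mapped onto EXIF tags. Keys are illustrative assumptions."""
    captured_at = captured_at or datetime.now(timezone.utc)
    return {
        "timestamp": captured_at.isoformat(),
        "latitude": lat,
        "longitude": lon,
        "azimuth_deg": azimuth_deg % 360.0,  # direction the camera pointed
        "shutter_s": shutter_s,
        "f_number": f_number,
    }
```

Normalizing the azimuth into [0, 360) keeps the direction usable by mapping software regardless of how the compass reading wrapped.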
- The image application can be configured to capture a panoramic, cylindrical, and/or spherical image by collecting and combining multiple images taken by the camera 10.
- The image application uses field of view information provided by the camera 10 to determine where to position the camera 10 to collect all of the images that are used to create a completed panoramic, cylindrical, and/or spherical image.
- The image application can be configured to control the mount 20 such that all of the images used to create a final panoramic, cylindrical, and/or spherical image are automatically collected.
- The image application can also be configured to provide feedback to the user of the camera 10 to indicate where to move the camera 10 to capture each of the images used to create a final panoramic, cylindrical, and/or spherical image. For example, audible and/or physical indications can be used to alert the user of the camera 10 where the next image should be taken.
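The described use of field-of-view information to plan capture positions can be sketched as a small calculation: each frame covers the horizontal field of view, and successive frames must overlap for stitching. The 25% overlap fraction is an assumed tuning parameter, not a value from the patent:

```python
import math

def panorama_headings(h_fov_deg, overlap_frac=0.25, sweep_deg=360.0):
    """Yaw angles (degrees) at which to fire the shutter so successive
    frames overlap enough to stitch. Sketch of the described use of
    field-of-view information; overlap fraction is an assumption."""
    step = h_fov_deg * (1.0 - overlap_frac)  # usable angle per frame
    count = math.ceil(sweep_deg / step)      # frames needed to cover sweep
    # Distribute the frames evenly over the sweep.
    return [round(i * sweep_deg / count, 2) for i in range(count)]
```

A camera reporting a 60-degree horizontal field of view would be positioned at eight evenly spaced headings for a full 360-degree cylindrical row; the same list can drive either the motorized mount or the audible "move here next" prompts.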
- The image application can be configured to control the camera 10 and/or the mount 20 via, for example, a wired and/or wireless connection.
- For example, each of the following types of connections can be used: a charging/base connector, a Bluetooth connection, a WiFi connection, an infrared transceiver, a headset jack in the camera 10 (e.g., using the left/right/common/ground leads), and/or other proprietary connectors.
- The headset jack, if present on the camera 10, can also be connected to an infrared transceiver and used to control the camera 10 and/or the mount 20. Audio signals created by the camera 10 can also be used as a wireless link to control the mount 20.
- The image application, camera 10, and/or mount 20 can be configured for “follow-me” operation.
- In follow-me operation, facial recognition can be used to track the location of a person.
- A remote control can be configured to emit a beacon of light that can be used to track the location of the remote control (e.g., by keeping the beacon in the center of the frame).
- The light can be at any wavelength that can be detected by sensors used with the image application and/or camera 10.
- The sensors can include, for example, sensors configured to detect the UV and IR bands of electromagnetic radiation.
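The "keep the beacon in the center of the frame" behavior reduces to finding the beacon in the image and converting its offset from center into a pan correction. A toy sketch on a grayscale frame represented as nested lists; treating the brightest pixel as the beacon and mapping pixels linearly to angle are simplifying assumptions:

```python
def beacon_pan_correction(frame, h_fov_deg):
    """Locate the brightest pixel in a grayscale frame (a stand-in for
    the remote's light beacon) and return the pan angle, in degrees,
    that would re-center it. `frame` is a list of rows of brightness
    values; a positive result means pan right."""
    best = max(
        ((val, x) for row in frame for x, val in enumerate(row)),
        key=lambda t: t[0],
    )
    width = len(frame[0])
    # Fractional horizontal offset of the beacon from the frame center.
    offset = (best[1] - (width - 1) / 2.0) / width
    return offset * h_fov_deg
```

Feeding the returned angle to the motorized mount each frame keeps the beacon, and so the person holding the remote, in view.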
- The image application can be configured to provide a setup mode.
- In the setup mode, the camera 10 is preferably oriented flat, and a bubble level utilizing sensors in the camera 10 can be used to level the camera by adjusting, for example, the head of a tripod.
- The image application can be configured to work in various configurations.
- The image application can be executed locally on the camera 10 and use the memory contained in the camera 10 to store images.
- The image application is executed locally on the camera 10 and the user is able to interface with the image application using the camera 10 (e.g., via a touchscreen interface on the camera 10).
- The user of the camera 10 can preferably take and edit pictures using the camera 10, and can preferably store pictures in a memory contained in the camera 10.
- The image application can be executed locally on the camera 10 and use a remote memory (e.g., in the cloud) to store images.
- The image application is preferably executed locally on the camera 10 and the user is able to interface with the image application using the camera 10 (e.g., via a touchscreen interface on the camera 10).
- The user can preferably use the image application to take and edit pictures using the camera 10.
- The image application is preferably configured such that the user can store pictures taken and/or edited by the camera 10 in a remotely located memory (e.g., in the cloud).
- The remotely located memory can be a hard drive that is located in a server that is accessible via the Internet.
- The image application can be executed remotely from the camera 10 (e.g., in a server in the cloud) and can use a memory that is located remotely from the camera 10 (e.g., a memory in the cloud) to store images.
- The image application is preferably executed on a remotely located processor.
- The processor can be in physical proximity to the camera 10 and/or accessible over the Internet (e.g., via a webpage).
- The user of the camera 10 can interact with the image application via, for example, a computing device having a network connection capable of communicating with the computer executing the image application.
- The user can cause the camera 10 to take pictures.
- The user can also preferably edit previously taken pictures and store those pictures in a remotely located memory (e.g., in the cloud).
- The image application can be configured to allow users to customize and save a set of standard image profiles. These image profiles can be uploaded to the cloud to enable a user to access the image profiles regardless of the location of the user. Preferably, the image profiles can also be edited online and downloaded (or pushed) to the image application. To the extent that multiple users have created image profiles and saved these to a centrally located server, data mining applications in the centrally located server can be configured to use the saved profiles to refine the default settings of the image application.
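Round-tripping a profile through a text format is what makes the described upload/edit/push cycle work across devices. A minimal sketch using JSON; the profile keys shown are illustrative assumptions, not a schema from the patent:

```python
import json

def profiles_to_json(profiles):
    """Serialize named shot profiles to a JSON string that could be
    uploaded to a server and pulled down on another device, as the
    text suggests. Profile keys are illustrative assumptions."""
    return json.dumps(profiles, indent=2, sort_keys=True)

def profiles_from_json(text):
    """Rebuild the profiles dict from the stored JSON string."""
    return json.loads(text)
```

Because the serialized form is plain text, the same payload can be edited online and pushed back to the application unchanged.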
- The following paragraphs describe one exemplary method of operation of the image application.
- The exemplary method of operation may be altered by, for example, removing certain functionality and/or adding additional functionality.
- Exemplary screen shots of the image application are shown in FIGS. 7-11.
- The application preferably starts with a screen briefly showing the logo, product name, and version information, and, after a brief timeout, starts where the user left off in a project, or at the project list on the studio screen if there is no saved state or if the saved state is unusable.
- A splash screen is not required, and, if used, can vary from that described above.
- An exemplary splash screen is shown in FIG. 2.
- The Studio screen preferably shows a list of projects (e.g., as shown in FIG. 6). If there are no projects, it can show text prompting the user to create a project using a menu on this screen (e.g., as shown in FIG. 8). Projects can organize images, how they are processed and combined (transforms), and the outputs or exports of the projects (e.g., jpg files, QuickTime VR, RAW, etc.). Projects are not required to be single documents.
- A project can refer to images that are stored in, for example, the system's image database(s). Moving a project can involve a special type of “export,” which is an archive containing project data and copies of images imported and/or acquired as the inputs to the project.
- A project is preferably a collection of images and compound images.
- The outputs, or exports, from a project are images in various formats that are, in general, meant to be HDR images and/or panoramas, though an output could also be a single, unprocessed image.
- Projects preferably start with either an “import” of images, or by acquiring images (e.g., using the camera 10 ). At any point in a project, additional images can preferably be imported or acquired. After images are imported or acquired, the user can create an output by processing images (e.g., specifying transforms on images) and/or organizing them into composite images.
- Composite images can be designated as outputs or exports by specifying formats for the resulting outputs or exports.
- Composite images can be multiple images from the same perspective combined into a single image, or they can be, for example, multiple images combined, or stitched, into a mosaic, or panorama.
- Multiple single images are combined into a panorama or HDR image.
- Each image in a panorama can be a composite image.
- Panoramas are typically multiple images in single or multiple rows. If the rows of a panorama are taken from perspectives covering 360 degrees around a single point, a cylindrical panorama can be produced. Spherical panoramas are also possible.
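How many rows a panorama needs follows from the camera's vertical field of view by the same overlap logic used horizontally. A sketch; the 25% overlap margin and the 180-degree vertical sweep for a full sphere are assumptions for illustration:

```python
import math

def panorama_rows(v_fov_deg, overlap_frac=0.25, vertical_sweep_deg=180.0):
    """Number of rows of frames needed to cover a vertical sweep (180
    degrees for a spherical panorama), given the camera's vertical
    field of view. The overlap fraction is an assumed stitching margin."""
    step = v_fov_deg * (1.0 - overlap_frac)  # usable vertical angle per row
    return math.ceil(vertical_sweep_deg / step)
```

A camera with a 45-degree vertical field of view would need six rows for a spherical capture, while a single-row cylindrical panorama simply skips the vertical sweep.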
- Transforms, such as a blur filter or a perspective shift, can alter an image or compound image.
- The available set of transforms should facilitate creating HDR images and panoramas.
- Transforms can operate on whole and/or partial images. Operations can be performed by direct manipulation when possible (e.g., dragging images into groups, and context menus enabling a selection of transforms).
- The project screen can display images and composite images in a way that visually represents the content of those entities (e.g., as shown in FIG. 9).
- Individual images can be represented by thumbnail images.
- Composite images consisting of images taken from the same perspective can be visually represented by a “stack” of thumbnail images.
- Groups of images comprising a panorama can be shown as thumbnails grouped into the rows of the panorama.
- Setup of the camera 10 can typically be performed at any time up until a shot is started.
- Setup of the camera 10 can include leveling the mount 20 .
- a level indicator with bubble levels can be shown on a screen of the camera 10 , and a tone can be played when the mount is level.
- a warning tone and/or a warning dialog can appear when the user starts a shot. The user can preferably choose to permanently dismiss the warning dialog.
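The bubble-level indicator described above reduces to deriving pitch and roll from the accelerometer's gravity vector and comparing them against a tolerance. The sketch below is an assumption about how such a check might look; the 0.5 degree tolerance is an illustrative guess.

```python
import math

def tilt_degrees(ax: float, ay: float, az: float) -> tuple:
    """Pitch and roll, in degrees, from a gravity (accelerometer) vector."""
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    return pitch, roll

def is_level(ax: float, ay: float, az: float, tol_deg: float = 0.5) -> bool:
    """True when the mount is level enough to play the confirmation tone."""
    pitch, roll = tilt_degrees(ax, ay, az)
    return abs(pitch) <= tol_deg and abs(roll) <= tol_deg
```

A camera lying flat reports gravity almost entirely on one axis, so `is_level(0.0, 0.0, -1.0)` holds, while a small sideways component tips the check to false.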
- the first step in shooting is choosing the type of shot.
- Shots can be, for example, single frames with one or more images and/or panoramas with multiple frames, with each frame including one or more images.
- a menu of shot types can be presented, each one a variation on the two basic types. If a previously stored shot type has been chosen, and the user does not want to modify it, the user is ready to start the shot. If the user wants to modify a shot setup or create one from scratch, the user can navigate to the shot setup screen by choosing to modify a shot setup, or to create one.
- the default setups, and any the user creates, can be stored in the program data, backed up to a remote server (e.g., in the cloud), and are typically not part of a project. Shot setups can preferably be used with any project.
- the user can preferably edit the programming for a shot.
- the programming that the user can perform can vary for the different types of shots.
- the display can show a viewfinder plus controls for setting up a frame consisting of multiple images to produce HDR, super-resolution, and other multiple image effects from a single perspective.
- the editor can show the position of each frame of a panorama, information about the setup of the individual frames, and controls to set the parameters for the panorama (e.g., cylindrical or less than 360 degree, number of frames, number of rows).
- the shooting screen can be displayed after the shot has been set up.
- the user can preferably configure how the image is captured by the camera. For example, the user can be given the opportunity to release the shutter immediately, after a preset delay, or after a user-specified delay. Other options to capture the image are also possible. For example, the user can capture an image by pressing the call-answer/end-call button on a Bluetooth headset paired with the device containing the camera 10 (e.g., an IPHONE), and/or by using an infrared remote. While a “shutter” has been discussed above, this is not to imply that a physical shutter is required. Other shuttering techniques are possible (e.g., an electronic shutter).
- every image in a project can be edited.
- Editing can specify a set of transforms applied to the image.
- the image application can be configured such that the original image is always retained (e.g., backed up), and transforms can preferably be removed at any later time, reverting the image to its condition before the transform was applied.
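The non-destructive editing model described above, in which the original is always retained and any transform can be removed later, can be sketched as an ordered transform stack over an untouched original. The class and method names are illustrative, not the application's API.

```python
class EditableImage:
    """Non-destructive editing: the original pixels are always retained,
    and any transform can be removed later, reverting its effect."""

    def __init__(self, pixels):
        self._original = list(pixels)  # original image is never modified
        self._transforms = []          # ordered list of (name, function)

    def apply(self, name, fn):
        self._transforms.append((name, fn))

    def remove(self, name):
        """Drop a transform; subsequent renders no longer include it."""
        self._transforms = [(n, f) for n, f in self._transforms if n != name]

    def render(self):
        """Re-derive the edited image from the original each time."""
        pixels = list(self._original)
        for _, fn in self._transforms:
            pixels = fn(pixels)
        return pixels
```

Removing a transform simply re-renders from the retained original, which is what makes reverting at any later time possible.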
- Deleted images are preferably listed in a “trash” list.
- the images in the trash can be permanently deleted in low resource conditions, or explicitly on the user's command.
- the application can be configured to warn users away from performing operations that are not strictly incorrect, but that are unlikely to work or produce a good image. For example, the application can warn when the user adds images not taken from the same point to a panorama, or images not taken from the same perspective to a composite image from which an HDR image can be derived. These warnings should be given throughout the program when the user performs an operation that can be inferred to have an undesirable result, such as deleting an image that was previously selected for saving, and/or creating a shot setup with nonsensical parameters (e.g., too much or too little overlap, unusual exposure settings, etc.).
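A heuristic check of this kind might look like the following sketch. The thresholds are illustrative guesses, not values from the application.

```python
def shot_setup_warnings(overlap: float, exposure_stops: float) -> list:
    """Return warning strings for a shot setup that is not strictly
    incorrect but is unlikely to produce a good image. The thresholds
    here are assumptions for illustration only."""
    warnings = []
    if overlap < 0.15:
        warnings.append("overlap may be too small for reliable stitching")
    elif overlap > 0.60:
        warnings.append("overlap is unusually large; frames are mostly redundant")
    if abs(exposure_stops) > 4.0:
        warnings.append("unusual exposure bracketing range")
    return warnings
```

A reasonable setup produces no warnings, while an extreme one can trigger several at once; the user can then proceed or adjust.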
- the functionality described herein can also be implemented on a server that is accessible via the Internet, or other network.
- a server can be accessed by an IPHONE over a cellular network.
- the image application can be configured with some differences when implemented as a web-based application.
- functions in a Web version of the image application can differ, e.g., relative to a locally hosted version of the application.
- Performance requirements and algorithm implementation in web-based versions of the image application can be varied (e.g., as Web servers generally have more computing power than a portable device, more CPU-intense processes can be used).
- the “studio” can relate to image creation.
- One exemplary organizational hierarchy can be: layer>image>canvas>project.
- a layer is preferably a single image.
- An image is preferably a set of layers from a single perspective. Layers in the image might include differences in exposure, focus, etc.
- the assumption is the layers will be edited and merged down.
- a canvas is preferably a collection of layers and images that are organized. The assumption is that the images will be stitched and/or merged as a part of a larger image.
- a project is preferably a collection of layers, images and canvases that are logically organized based on a project. The user can start by creating a new project or opening an existing one. The user can then make the decision to create an image or canvas. At this point, the difference between an image and a canvas is whether the camera will be fixed or moved.
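The layer > image > canvas > project hierarchy described above can be sketched with simple container types. The type and field names are illustrative, not taken from the application.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Layer:
    """A single captured image."""
    name: str

@dataclass
class Image:
    """A set of layers taken from a single perspective
    (e.g., differing in exposure or focus)."""
    layers: List[Layer] = field(default_factory=list)

@dataclass
class Canvas:
    """A collection of images to be stitched and/or merged
    into a larger composite."""
    images: List[Image] = field(default_factory=list)

@dataclass
class Project:
    """The top-level container for layers, images, and canvases."""
    name: str
    images: List[Image] = field(default_factory=list)
    canvases: List[Canvas] = field(default_factory=list)
```

A fixed camera populates an Image directly; a moving camera populates a Canvas whose Images are later stitched.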
- the user can preferably now select whether to set the focus, aperture, and exposure settings.
- Options can include Manual (e.g., each variable is set manually or using the floating target box and the brackets), Recommend (e.g., the image application analyzes the image and makes recommendations on bracketing and settings), and Automatic (e.g., the user can simply operate the shutter and the recommended settings are put in effect based on algorithms that make decisions relating to bracketing, focus, and exposure).
- This process is preferably the same as creating an image except that the canvas is preferably defined and the brackets are preferably made based on first sampling the canvas.
- the brackets will preferably be set for the composite canvas and applied (if necessary) to the individual images.
- the canvas can be set manually by entering “SetUp” and stepping the camera left and right to the end points and clicking on the shutter.
- the canvas can be specified as 360 degree Pano (e.g., the camera 10 preferably steps through a full circle to capture images (e.g., 14 steps)), Custom Pano (e.g., the image application preferably instructs the photographer to use the motor control to move the camera to the left and right limits of the image and then press the shutter button), Spherical (e.g., the camera 10 is preferably able to create three rows of images: one row above the ground plane and one below. Setting these rows will consist of manually panning the camera, perhaps with an audible detent).
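Stepping through a full circle (e.g., 14 steps) reduces to computing the azimuth of each stop. A minimal sketch, with an assumed function name:

```python
def pano_step_azimuths(n_steps: int, start_deg: float = 0.0) -> list:
    """Azimuth (degrees) at which each frame of a full-circle pano is
    captured, stepping evenly through 360 degrees from a start heading."""
    step = 360.0 / n_steps
    return [(start_deg + i * step) % 360.0 for i in range(n_steps)]
```

With 14 steps, each stop is 360/14, or about 25.7 degrees, from its neighbor, and the eighth stop faces directly away from the first.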
- a process 100 for capturing a panoramic image using the system 5 includes the stages shown.
- the process 100 is exemplary only and not limiting.
- the process 100 may be altered, e.g., by having stages added, removed, or rearranged.
- the user creates a new project in the image application.
- the project can be given an arbitrary name and/or assigned a name by the image application.
- the user captures one or more photographs and/or video using the camera 10 . Capture of the photographs and/or video can be done manually and/or under the control of the image application (e.g., being executed in the camera 10 ).
- the display of the camera 10 includes an alignment icon 205 and alignment grid 210 .
- one of the cells in the alignment grid 210 preferably turns a different color to indicate that the corresponding image has been captured.
- the user moves the camera 10 to the next location.
- the user is guided to the location of the next picture by keeping the alignment icon in the center of the screen of the camera 10.
- the alignment grid 210 can change colors when the camera is properly aligned for the next shot.
- individual cells of the alignment grid 210 change color to indicate a picture for that location has been captured. The user can preferably see, at a glance, which pictures are missing from the pictures needed to create a composite image.
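The per-cell capture tracking described above can be sketched as a small grid model; the class name and methods are assumptions for illustration.

```python
class AlignmentGrid:
    """Tracks which cells of the capture grid already have a picture,
    so missing frames can be seen at a glance."""

    def __init__(self, rows: int, cols: int):
        self.captured = [[False] * cols for _ in range(rows)]

    def mark_captured(self, row: int, col: int):
        """Record a capture; in the UI, this cell changes color."""
        self.captured[row][col] = True

    def missing(self) -> list:
        """(row, col) cells still needed to complete the composite."""
        return [(r, c)
                for r, row in enumerate(self.captured)
                for c, done in enumerate(row)
                if not done]
```

The `missing()` list is what lets the user see, at a glance, which pictures are still needed.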
- the camera 10 is configured to calculate the positioning of the camera 10 .
- the calculations can be accomplished using, for example, sensors that are present in the camera 10 , such as, GPS, accelerometer, compass, facial recognition, etc.
- the camera 10 can calculate the various positions needed to produce a final composite image and control the mount 20 accordingly.
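Driving the mount to the next position amounts to turning the current heading and the target heading into a signed rotation. The helper below is a hypothetical sketch of that calculation, not the application's motor interface.

```python
def rotation_command(current_deg: float, target_deg: float) -> float:
    """Signed shortest rotation (degrees) from the current heading to the
    target heading; positive means clockwise. A motor controller would
    consume this value to move the mount."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0
```

Wrapping through north is handled naturally: from 350 degrees to 10 degrees the command is +20, not -340.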
- the camera 10 can automatically capture the desired images as the mount 20 moves the camera 10 .
- images captured by the camera 10 are saved to a memory.
- the storage can be present locally in the camera 10 and/or can be located remotely over a network connection.
- Each individual image captured by the camera 10 can be saved as captured, or saved in a bulk manner.
- the image application can be configured to save some pictures in the camera 10 and to save others (e.g., backup copies) in a remote storage location.
- images captured by the camera 10 are processed by the image application.
- the image application merges multiple images captured by the camera 10 in order to create a composite image such as a panoramic, spherical, cylindrical, VR, and/or HDR image. This can be accomplished by, for example, stitching together the multiple images captured by the camera 10 .
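Before blending, stitching must place each frame on the composite. The sketch below computes frame positions on a horizontal strip under the simplifying assumption of a known, fixed pixel overlap; real stitching would instead estimate the overlap by matching features between neighboring images.

```python
def frame_offsets(n_frames: int, frame_w: int, overlap_px: int):
    """x-offset of each frame on a horizontal composite strip, plus the
    total strip width, assuming a fixed pixel overlap between neighbors."""
    step = frame_w - overlap_px  # horizontal advance per frame
    offsets = [i * step for i in range(n_frames)]
    return offsets, offsets[-1] + frame_w
```

Three 100-pixel frames with a 20-pixel overlap occupy a 260-pixel strip rather than 300 pixels, since each overlap region is shared.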
- the processing can be performed locally on the camera 10 and/or remotely (e.g., in a remote processor and/or in the cloud).
- the subject matter described herein can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structural means disclosed in this specification and structural equivalents thereof, or in combinations of them.
- the subject matter described herein can be implemented as one or more computer program products, such as one or more computer programs tangibly embodied in an information carrier (e.g., in a machine-readable storage device), or embodied in a propagated signal, for execution by, or to control the operation of, data processing apparatus (e.g., a programmable processor, a computer, or multiple computers).
- a computer program (also known as a program, software, software application, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
- a computer program does not necessarily correspond to a file.
- a program can be stored in a portion of a file that holds other programs or data, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
- a computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
- processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer.
- a processor will receive instructions and data from a read-only memory or a random access memory or both.
- the essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data.
- a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto-optical disks, or optical disks.
- Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices, (e.g., EPROM, EEPROM, and flash memory devices); magnetic disks, (e.g., internal hard disks or removable disks); magneto-optical disks; and optical disks (e.g., CD and DVD disks).
- the processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
- the subject matter described herein can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user, and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer.
- Other kinds of devices can be used to provide for interaction with a user as well.
- feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user can be received in any form, including acoustic, speech, or tactile input.
- the subject matter described herein can be implemented in a computing system that includes a back-end component (e.g., a data server), a middleware component (e.g., an application server), or a front-end component (e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an implementation of the subject matter described herein), or any combination of such back-end, middleware, and front-end components.
- the components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
Abstract
Description
- This application claims the benefit of prior U.S. Provisional Application No. 61/321,297, filed Apr. 6, 2010, which is incorporated by reference herein in its entirety.
- Today, digital cameras have proliferated to a level unimaginable only a few years ago. Not only are digital cameras stand-alone devices, they are also features added to numerous electronic products. Digital cameras can now be found on, for example, portable electronic devices such as mobile phones, IPODS, and IPHONES. Typically, however, digital cameras found on portable electronic devices suffer from limitations such as limited field of view, limited resolution, and small aperture.
- In general, in an aspect, embodiments of the invention may provide a system including a camera that includes a lens having an aperture; a tripod configured to support the camera; a mount configured to be coupled to the tripod and to the camera, the mount including a pivot that is configured such that the camera is rotatable around an axis that is aligned with the aperture of the camera; a motor configured to control the rotation of the mount; and a memory containing computer readable instructions that, when executed by a processor, cause the processor to calculate a desired position of the camera using sensor information, control the motor such that the camera is moved to the desired position, control the shutter of the camera to capture an image, and store the captured image.
- Implementations of the invention can provide one or more of the following features. The processor is included in the camera. The motor is configured to control the rotation of the mount around a vertical axis that is aligned with the aperture of the camera. The computer readable instructions are further configured to cause the processor to capture a plurality of images that are combinable to create a composite image. The composite image is selected from the group consisting of a panoramic image, a cylindrical image, a spherical image, and a high dynamic range image. The computer readable instructions are further configured to cause the processor to combine the plurality of images to create a composite image.
- In general, in another aspect, embodiments of the invention may provide a non-transitory computer readable medium including computer readable instructions that, when executed by a processor, are configured to cause the processor to control the operation of a camera, collect information from a sensor in the camera, process the sensor information to calculate a position of the camera, control a motor attached to the camera to position the camera using the calculated position, wherein the motor is configured to rotate the camera around an axis that is aligned with an aperture of the camera, capture an image using the camera, and store the captured image.
- Implementations of the invention can provide one or more of the following features. The computer readable instructions are executed in a processor included in the camera. Controlling the motor includes controlling the rotation of the camera around a vertical axis that is aligned with the aperture of the camera. The computer readable instructions are further configured to cause the processor to capture a plurality of images that are combinable to create a composite image. The computer readable instructions are further configured to cause the processor to create a plurality of images that are combinable into at least one of a panoramic image, a cylindrical image, a spherical image, and a high dynamic range image. The computer readable instructions are further configured to cause the processor to combine the plurality of images to create a composite image.
- Various aspects of the invention may provide one or more of the following capabilities. Multi-picture composite images can be created. Panoramic, cylindrical, and spherical images can be captured. A camera can be automatically controlled and positioned using an image application. A camera can provide information to an image application that is used to control the camera. Images captured by the camera can be stored remotely in the cloud. A user can be provided guidance on how to capture successive images used to create a multi-picture composite image. Images captured by the camera can be backed up by automatically uploading the images to a storage location using any available wireless communications network.
- These and other capabilities of the invention, along with the invention itself, will be more fully understood after a review of the following figures, detailed description, and claims.
- FIG. 1 is a diagram of a system for capturing images.
- FIG. 2 is a diagram of a portion of the system shown in FIG. 1.
- FIG. 3 is a diagram of a portion of the system shown in FIG. 1.
- FIG. 4 is a diagram of a system for capturing images.
- FIG. 5 is a diagram of a system for capturing images.
- FIG. 6 is a diagram of a portion of the system shown in FIG. 1.
- FIGS. 7-11 are exemplary screen shots of a computer application for use with a camera.
- FIG. 12 is a process flow diagram of a process for use with a camera.
- Embodiments of the invention provide techniques for capturing images using a smartphone. In one embodiment, the smartphone is supported on a mount that is configured such that the smartphone can rotate about several axes that are aligned with the focal plane/point of a camera included in the smartphone. Some embodiments include a motorized mount that can be configured to automatically rotate the smartphone to capture images in an automated fashion. An image capture application can also be provided that is configured to work with and control the mount and to control the smartphone to capture images. The image capture application can be configured to automatically capture a series of images using the smartphone that can be combined into a single image. The image capture application can also be configured to automatically copy media files to a remote storage location. The image capture application can also be configured to allow a user to edit and/or manipulate images captured by the smartphone. The image capture application can also be configured to register the date, time, location, direction, and camera orientation of every image and/or video that is captured. This information can then be attached to photographs and videos that are captured such that they can easily be organized and searched at a later time. Other embodiments are within the scope of the invention.
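The registration of date, time, location, direction, and orientation for each capture can be pictured as a metadata record attached to the image or video. The field names below are illustrative assumptions, not the application's actual schema.

```python
def capture_record(date: str, time: str, lat: float, lon: float,
                   azimuth_deg: float, orientation: str) -> dict:
    """Metadata registered for one captured image and/or video, so the
    media can be organized and searched later."""
    return {
        "date": date,
        "time": time,
        "latitude": lat,
        "longitude": lon,
        "azimuth_deg": azimuth_deg,   # compass direction the lens faced
        "orientation": orientation,   # e.g., "portrait" or "landscape"
    }
```

Records like this are what allow, for example, mapping software to show the direction the camera was pointed when an image was captured.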
- Referring to
FIGS. 1-6 , anexemplary system 5 includes acamera 10, acase 15, amount 20, and atripod 25. WhileFIG. 1 shows an APPLE IPHONE as thecamera 10, other cameras can be used. - The
camera 10 is preferably a smart camera that is configured to take pictures and/or capture video. Thecamera 10 preferably includes a microprocessor on which a third-party application can be executed. Thecamera 10 can be a stand-alone camera or can be part of another device such as, a smart phone, MP3 player, or tablet PC. Examples of thecamera 10 are APPLE IPHONE, APPLE IPAD, RIM BLACKBERRY, APPLE IPOD, and DROID INCREDIBLE. Preferably, thecamera 10 can be configured to download and execute applications provided by third parties (e.g., via an Internet connection). For example, the APPLE IPHONE is configured to download applications from an application store. The combination of the camera functionality and the ability to execute third-party applications can create novel functionality and enable users to create images of varying quality and type that can be shared online. - The
mount 20 is preferably configured to connect to thecamera 10 such that themount 20 can support and/or maneuver thecamera 10. Themount 20 can connect directly to a fitting in thecamera 10, and/or can use an adapter such as acase 15 that is configured to fit thecamera 10. Preferably, themount 20 includes one or more gimbals that are configured such that it creates two rotational planes that intersect at the aperture and/or focal plane of the lens of thecamera 10. In this configuration, parallax that results from rotation of thecamera 10 can be reduced or eliminated. Themount 20 can be set on a stable surface and/or can be used in conjunction with a tripod (or other hardware used to eliminate vibrations) to provide manual and/or motorized control of the rotation of thecamera 10. Themount 20 can also function as a handle for thecamera 10. - An exemplary version of the
mount 20 includespivots camera 10 to rotate about the y-axis, thepivot 60 is configured to allow thecamera 10 to rotate about the x-axis, and thepivot 65 is configured to allow thecamera 10 to rotate about the z-axis. Thepivots pivots - The
mount 20 can also include one ormore motors 70 that are configured to rotate thecamera 10 under the control of a computerized application. Preferably, the motors used to rotate thecamera 10 are small motors requiring little electrical power. The motor can be configured to rotate thecamera 10 on one or more pivot points of the mount 20 (e.g., x, y, and z axes). Preferably, power is provided to the motor by abattery pack 75, and/or an external power connection. Motorized embodiments of themount 20 can be configured in various ways. For example, there can be asingle motor 70 configured to rotate thecamera 10 about a single axis ormultiple motors 70 can be used to rotate thecamera 10 about more than one axis. For example, a panorama mount can automate image capture in the horizontal plane while the vertical panning/tilt angle can be adjusted manually to capture a virtual reality or spherical image. A virtual reality mount can be motorized on both the vertical and horizontal plane and can fully automate capturing a spherical image. - The
mount 20 can be configured to communicate with one or more other devices. For example, themount 20 can be configured to communicate with thecamera 10, a remote computer, and/or remote control using a wireless (e.g., Infrared, Bluetooth, 802.11, etc) and/or wired connection (e.g., a multi-pin connector). The remote control can be, for example, a small traditional handheld unit, or another processor such as a smart phone, PC, and/or tablet operating a control application. The communication connections can be used to control the mount 20 (e.g., to control the direction and speed of movement in the vertical and/or horizontal planes) and/or to control thecamera 10 itself (e.g., to control the parameters of image capture). Thecamera 10 can also use the connection with themount 20 to provide information collected by sensors in thecamera 10. For example, thecamera 10 can provide information captured by onboard sensors (e.g., a gyroscope, clock, GPS receiver, and/or compass). - The
mount 20 and/or thecamera 10 can be configured to be controlled by aremote control 80. For example, theremote control 80 can be used to remotely control how and when thecamera 10 takes a picture. Theremote control 80 can also be used to control the movement of a motorized version of themount 20. Theremote control 80 can also be used for “follow me” functionality where themount 20 is configured to automatically point the camera at the remote so that, for example, the person holding theremote control 80 is always in the picture and/or video being captured by thecamera 10. The remote control can also be configured to snap to themount 20 when not in use. - The
mount 20 can include additional features that make control of thecamera 10 easier. For example, themount 20 can include a detent on the pivot for the vertical plane that can lock thecamera 10 in position to simplify initial setup. This can be used in conjunction with, for example, an electronic version of a bubble level in an image application and can be used to level the mount when adjusting a tripod or other camera stabilization device. Themount 20 can also be configured to easily switch thecamera 10 between portrait and landscape modes. - The
tripod 25 can be a standard tripod. For example, thetripod 25 can includelegs 30 that are used to support to stabilize thecamera 10. Themount 20 and/ortripod 25 can also be configured to function as a handle when not being used to support thecamera 10. One of the three legs of the tripod can be configured as a larger leg such that the other two smaller legs, when folded, fit inside the larger leg (e.g., as shown inFIG. 6 ). Thetripod 25 can also include a power source used to power the mount 20 (e.g., battery pack). - The
system 5 is preferably configured to work with an image application. The image application can be executed by thecamera 10 and/or another processor (e.g., a remote computer). The user can preferably interface with the image application via a display on thecamera 10 and/or via an external display. The image application can preferably be used to control thecamera 10 and/or themount 20. The image application can also preferably be used to receive images captured by thecamera 10 and to process the same. The image application can also be configured to allow user to edit images captured by thecamera 10. The image application can be, for example, an application that is configured to run on an APPLE IPHONE. The image application can be configured to communicate with network servers via a wired and/or wireless network connection (e.g., a 3G connection provided by a cellular service provider). The image application can be configured to process and upload images captured by thecamera 10. For example, in a cloud computing system, the image application can be configured to upload images to a remote server in the cloud. The image application can also be configured to automatically communicate with a remote network-based application that can process images and can be used for editing by the user. - The image application can be configured to use information provided by the
camera 10 to control thecamera 10. For example, the image application can be configured to receive information from sensors contained in thecamera 10 to control thecamera 10 and/or themount 20. For example, the image application can be configured to use location information, GPS information, azimuth, orientation, brightness, acceleration, and sound to control thecamera 10. The image application can be configured to receive sensor information from thecamera 10 that can be used to calculate the position of thecamera 10. Using the position information, the image application can be configured to control thecamera 10 to, for example, take a series of consecutive images that can then be stitched together later to create a panoramic image. - In one example, the image application can be configured to control the capture of photos by the
camera 10. The image application can be configured to use additional information to control the parameters of the image capture itself (e.g., using brightness information provided by thecamera 10 to control shutter speed and aperture of the camera 10). For example, the focus, aperture, and shutter speed of thecamera 10 can be set to custom values based on algorithms that evaluate the image and/or can be based on controls and readings available to thecamera 10. The image application can be configured to use many different types of information in order to control thecamera 10. For example, the image application can use information provided by the camera 10 (e.g., GPS information, brightness information, field of view, aperture, shutter speed, focus, gyroscope information, orientation information, etc.). - The image application can be configured to take multiple images that are combined to create a processed picture such as a panoramic, cylindrical, spherical, high dynamic range (HDR), and virtual reality images. This process can be improved by the fact that the
mount 20 rotates about axes that are aligned with one or more focal planes of thecamera 10. The image application can be configured to process the multiple images on thecamera 10 and/or at a remote computer (e.g., a remote webpage or using third-party image editing software). Sensor data from thecamera 10 can be added to the images (e.g., as metadata) that are collected by thecamera 10 order to combine the multiple images, and/or for other processing. As an example, azimuth and GPS information can be used by mapping software to show the direction the camera was pointed when an image was captured. Other metadata can be included as well, such as time, date, latitude, longitude, shutter speed, aperture, white balance, lens information, etc. - The image application can be configured to capture a panoramic, cylindrical, and/or spherical image by collecting and combining multiple images taken by the
camera 10. Preferably, the image application uses field of view information provided by thecamera 10 to determine where to position thecamera 10 to collect all of the images that are used to create a completed panoramic, cylindrical, and/or spherical image. For example, the image application can be configured to control themount 20 such that all of the images used to create a final panoramic, cylindrical, and/or spherical image are automatically collected. In addition, the image application can also be configured to provide feedback to the user of thecamera 10 to indicate where to move thecamera 10 to capture each of the images used to create a final panoramic, cylindrical, and/or spherical image. For example, audible and/or physical indications can be used to alert the user of thecamera 10 where the next image should be taken. - The image application can be configured to control the
camera 10 and/or mount 20 via, for example, a wired and/or wireless connection. For example, each of the following types of connections can be used: a charging/base connector, a Bluetooth connection, a WiFi connection, an infrared transceiver, a headset jack in the camera 10 (e.g., using the left/right/common/ground leads), and/or other proprietary connectors. The headset jack, if present on the camera 10, can also be connected to an infrared transceiver and used to control the camera 10 and/or mount 20. Audio signals created by the camera 10 can also be used as a wireless link to control the mount 20. - The image application,
camera 10, and/or mount 20 can be configured for “follow-me” operation. In one example of follow-me operation, facial recognition can be used to track the location of a person. In another example of follow-me operation, a remote control can be configured to emit a beacon of light that can be used to track the location of the remote control (e.g., keeping the beacon in the center of the frame). The light can be at any wavelength that can be detected by sensors used with the image application and/or camera 10. The sensors can include, for example, sensors configured to detect UV and IR bands of electromagnetic radiation. - The image application can be configured to provide a setup mode. In the setup mode, the
camera 10 is preferably oriented flat, and a bubble level utilizing sensors in the camera 10 can be used to level the camera by adjusting, for example, the head of a tripod. - The image application can be configured to work in various configurations. In a first exemplary configuration, the image application can be executed locally on the
camera 10 and use the memory contained in the camera 10 to store images. Preferably, in this configuration, the image application is executed locally on the camera 10 and the user is able to interface with the image application using the camera 10 (e.g., via a touchscreen interface on the camera 10). The user of the camera 10 can preferably take and edit pictures using the camera 10, and can preferably store pictures in a memory contained in the camera 10. - In a second exemplary configuration, the image application can be executed locally on the
camera 10 and use a remote memory (e.g., in the cloud) to store images. In this configuration, the image application is preferably executed locally on the camera 10 and the user is able to interface with the image application using the camera 10 (e.g., via a touchscreen interface on the camera 10). The user can preferably use the image application to take and edit pictures using the camera 10. The image application is preferably configured such that the user can store pictures taken and/or edited by the camera 10 in a remotely located memory (e.g., in the cloud). For example, the remotely located memory can be a hard drive that is located in a server that is accessible via the Internet. - In a third exemplary configuration, the image application can be executed remotely from the camera 10 (e.g., in a server in the cloud) and can use a memory that is located remotely from the camera 10 (e.g., a memory in the cloud) to store images. In this configuration, the image application is preferably executed on a remotely located processor. For example, the processor can be in physical proximity with the
camera 10 and/or accessible over the Internet (e.g., via a webpage). The user of the camera 10 can interact with the image application via, for example, a computing device having a network connection capable of communicating with the computer executing the image application. Using the image application, the user can cause the camera 10 to take pictures. The user can also preferably edit previously taken pictures and store those pictures in a remotely located memory (e.g., in the cloud). - The image application can be configured to allow users to customize and save a set of standard image profiles. These image profiles can be uploaded to the cloud to enable a user to access the image profiles regardless of the location of the user. Preferably, the image profiles can also be edited online and downloaded (or pushed) to the image application. To the extent that multiple users have created image profiles and saved them to a centrally located server, data mining applications in the centrally located server can be configured to use the saved profiles to refine the default settings of the image application.
- The following paragraphs describe one exemplary method of operation of the image application. The exemplary method of operation may be altered by, for example, removing certain functionality and/or adding additional functionality. Exemplary screen shots of the image application are shown in
FIGS. 7-11. - The application preferably starts with a screen briefly showing the logo, product name, and version information, and, after a brief timeout, starts where the user left off in a project, or at the project list on the studio screen if there is no saved state or the saved state is unusable. A splash screen is not required, and, if used, can vary from that described above. An exemplary splash screen is shown in
FIG. 2. - The Studio screen preferably shows a list of projects (e.g., as shown in
FIG. 6). If there are no projects, it can show text prompting the user to create a project using a menu on this screen (e.g., as shown in FIG. 8). Projects can organize images, how they are processed and combined (transforms), and the outputs or exports of the projects (e.g., jpg files, QuickTime VR, RAW, etc.). Projects are not required to be single documents. A project can refer to images that are stored in, for example, the system's image database(s). Moving a project can involve a special type of “export,” which is an archive containing project data and copies of images imported and/or acquired as the inputs to the project. - A project is preferably a collection of images and compound images. The outputs, or exports, from a project are images in various formats that are, in general, meant to be HDR images and/or panoramas, though an output could also be a single, unprocessed image. Projects preferably start with either an “import” of images, or by acquiring images (e.g., using the camera 10). At any point in a project, additional images can preferably be imported or acquired. After images are imported or acquired, the user can create an output by processing images (e.g., specifying transforms on images) and/or organizing them into composite images. Composite images can be designated as outputs or exports by specifying formats for the resulting outputs or exports.
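Since the exports from a project are often HDR images built from bracketed exposures, the merge step can be illustrated with a toy sketch. The following operates on flat lists of 8-bit pixel values; the function name, weighting scheme, and radiance model are illustrative assumptions, not taken from this application:

```python
def merge_hdr(exposures):
    """Merge bracketed exposures of the same scene into relative radiance.

    `exposures` is a list of (pixels, exposure_time_s) pairs, where pixels
    are 8-bit values in [0, 255]. Each pixel's radiance is estimated as
    value / exposure_time, weighted to trust mid-tones over values that
    are nearly clipped in the shadows or highlights.
    """
    def weight(v):
        # Hat function: 1.0 at mid-gray, 0.0 at 0 and 255 (clipped).
        return 1.0 - abs(v - 127.5) / 127.5

    n_pixels = len(exposures[0][0])
    merged = []
    for i in range(n_pixels):
        num = den = 0.0
        for pixels, t in exposures:
            w = weight(pixels[i])
            num += w * pixels[i] / t
            den += w
        # If every exposure clipped this pixel, fall back to the first one.
        merged.append(num / den if den else exposures[0][0][i] / exposures[0][1])
    return merged
```

A real implementation would also recover the camera's response curve and tone-map the merged radiance for display; this sketch only shows the weighted-average idea.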
- Images and Composite Images
- Multiple images captured by the
camera 10 can be combined into composite images. Composite images can be multiple images from the same perspective combined into a single image, or they can be, for example, multiple images combined, or stitched, into a mosaic or panorama. In one example, multiple single images are combined into a panorama or HDR image. In other examples, each image in a panorama can itself be a composite image. - Panoramas are typically multiple images in a single row or in multiple rows. If the rows of a panorama are taken from perspectives covering 360 degrees around a single point, a cylindrical panorama can be produced. Spherical panoramas are also possible.
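For a cylindrical panorama, the frame positions can be derived from the camera's horizontal field of view. This is a minimal sketch; the function name and the default overlap fraction are assumptions for illustration:

```python
import math

def pan_positions(horizontal_fov_deg, overlap_frac=0.25, sweep_deg=360.0):
    """Yaw angles (in degrees) at which to capture the frames of a panorama.

    Each frame advances by the field of view minus the requested overlap so
    that adjacent frames share enough content to be stitched; the step is
    then shrunk slightly so the frames divide the sweep evenly.
    """
    step = horizontal_fov_deg * (1.0 - overlap_frac)
    frames = math.ceil(sweep_deg / step)     # frames needed to cover the sweep
    actual_step = sweep_deg / frames         # spread the frames evenly
    return [i * actual_step for i in range(frames)]
```

For example, a 40-degree lens with 25% overlap yields 12 frames spaced 30 degrees apart; a wider lens needs fewer frames, which is how field of view information lets the application decide where to position the camera 10.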
- There are preferably two kinds of operations on images: transforms and grouping. Transforms (e.g., a blur filter, a perspective shift, etc.) can alter an image or compound image. The available set of transforms should facilitate creating HDR images and panoramas. Transforms can operate on whole and/or partial images. Operations can be performed by direct manipulation when possible (e.g., dragging images into groups, and context menus enabling a selection of transforms).
- The project screen can display images and composite images in a way that visually represents the content of those entities (e.g., as shown in
FIG. 9). For example, individual images can be represented by thumbnail images. Composite images consisting of images taken from the same perspective can be visually represented by a “stack” of thumbnail images. Groups of images comprising a panorama can be shown as thumbnails grouped into the rows of the panorama. - Setup of the
camera 10 can typically be performed at any time up until a shot is started. Setup of the camera 10 can include leveling the mount 20. To level the mount 20, a level indicator with bubble levels can be shown on a screen of the camera 10, and a tone can be played when the mount is level. If the mount is not level, a warning tone and/or a warning dialog can appear when the user starts a shot. The user can preferably choose to permanently dismiss the warning dialog. - Preferably, the first step in shooting is choosing the type of shot. Shots can be, for example, single frames with one or more images and/or panoramas with multiple frames, with each frame including one or more images. A menu of shot types can be presented, each one a variation on the two basic types. If a previously stored shot type has been chosen, and the user does not want to modify it, the user is ready to start the shot. If the user wants to modify a shot setup or create one from scratch, the user can navigate to the shot setup screen by choosing to modify a shot setup, or to create one. The default setups, and any setups the user creates, can be stored in the program data, backed up to a remote server (e.g., in the cloud), and are typically not part of a project. Shot setups can preferably be used with any project.
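The bubble-level check described above can be approximated from the camera's accelerometer: when the device is stationary, the accelerometer reads only gravity, so the tilt follows from the direction of that vector. The functions and the tolerance below are an illustrative sketch, not this application's implementation:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Pitch and roll (in degrees) from a 3-axis accelerometer reading."""
    pitch = math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))
    roll = math.degrees(math.atan2(-ax, az))
    return pitch, roll

def is_level(ax, ay, az, tolerance_deg=0.5):
    """True when both tilt axes are within tolerance, e.g., to play the
    confirmation tone and suppress the not-level warning."""
    pitch, roll = tilt_from_gravity(ax, ay, az)
    return abs(pitch) <= tolerance_deg and abs(roll) <= tolerance_deg
```

The axis convention here (z pointing out of the screen when the device lies flat) is an assumption; real sensor APIs document their own axes.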
- Using the shot setup screens (e.g., as shown in
FIG. 10), the user can preferably edit the programming for a shot. The programming that the user can perform can vary for the different types of shots. For a single frame, the display can show a viewfinder plus controls for setting up a frame consisting of multiple images to produce HDR, super-resolution, and other multiple image effects from a single perspective. When editing the relationship of multiple frames in a panorama, the editor can show the position of each frame of a panorama, information about the setup of the individual frames, and controls to set the parameters for the panorama (e.g., cylindrical or less than 360 degrees, number of frames, number of rows). - The shooting screen can be displayed after the shot has been set up. In this screen, the user can preferably configure how the image is captured by the camera. For example, the user can be given the opportunity to release the shutter immediately, after a preset delay, or after a user-specified delay. Other options to capture the image are also possible. For example, the user can capture an image by pressing the call-answer/end-call button on a Bluetooth headset paired with the device containing the camera 10 (e.g., an IPHONE), and/or by using an infrared remote. While a “shutter” has been discussed above, this is not to imply that a physical shutter is required. Other shuttering techniques are possible (e.g., an electronic shutter).
- Preferably, every image in a project can be edited. Editing can specify a set of transforms applied to the image. The image application can be configured such that the original image is always retained (e.g., backed up), and transforms can preferably be removed at any later time, reverting the image to its condition before the transform was applied.
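The retained-original editing model described above can be sketched as a transform stack: the original pixels are never modified, and rendering replays whichever transforms are currently attached. The class and its one-channel pixel lists are illustrative assumptions:

```python
class EditableImage:
    """Non-destructive editing: the original is always retained, and
    removing a transform reverts its effect on the next render."""

    def __init__(self, pixels):
        self._original = list(pixels)   # backed-up original, never mutated
        self._transforms = []           # ordered (name, per-pixel fn) pairs

    def add_transform(self, name, fn):
        self._transforms.append((name, fn))

    def remove_transform(self, name):
        self._transforms = [t for t in self._transforms if t[0] != name]

    def render(self):
        pixels = list(self._original)
        for _name, fn in self._transforms:
            pixels = [fn(p) for p in pixels]
        return pixels
```

For example, adding a "brighten" transform changes the rendered output, and removing it later reverts the image to its condition before the transform was applied, without ever touching the stored original.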
- Deleted images are preferably listed in a “trash” list. The images in the trash can be permanently deleted in low resource conditions, or explicitly on the user's command.
- The application can be configured to warn users away from performing operations that are not strictly incorrect, but that are unlikely to work or produce a good image. Examples include adding images to a panorama that were not taken from the same point, or adding images to a composite image from which an HDR image can be derived that were not taken from the same perspective. These warnings should be given throughout the program when the user performs an operation that can be inferred to have an undesirable result, such as deleting an image that was previously selected for saving, and/or creating a shot setup with what could be nonsensical parameters (e.g., too much or too little overlap, unusual exposure settings, etc.).
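Soft validation of a shot setup could look like the following sketch. The thresholds are invented for illustration, since the text only says the parameters "could be nonsensical":

```python
def shot_setup_warnings(overlap_frac=None, exposure_s=None):
    """Collect advisory warnings for a shot setup.

    None of these block the shot; they flag settings that are not
    strictly incorrect but are unlikely to produce a good image.
    """
    warnings = []
    if overlap_frac is not None:
        if overlap_frac < 0.15:
            warnings.append("overlap may be too small to stitch reliably")
        elif overlap_frac > 0.60:
            warnings.append("overlap is large; many redundant frames will be captured")
    if exposure_s is not None and exposure_s > 1.0:
        warnings.append("unusually long exposure; the result may be blurred")
    return warnings
```

The caller would surface these as the warning tones or dialogs discussed above, with the option to dismiss them permanently.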
- The functionality described herein can also be implemented on a server that is accessible via the Internet or another network. For example, a server can be accessed by an IPHONE over a cellular network. The image application can be configured with some differences when implemented as a web-based application. For example, differences in functions in a Web version of the image application (e.g., relative to a locally hosted version of the application) can include specifying a part of an image for applying a transform, and showing panels instead of separate screens for lists of projects, groups, etc. Performance requirements and algorithm implementation in web-based versions of the image application can be varied (e.g., as Web servers generally have more computing power than a portable device, more CPU-intensive processes can be used).
- The “studio” can relate to image creation. One exemplary organizational hierarchy can be: layer>image>canvas>project. A layer is preferably a single image. An image is preferably a set of layers from a single perspective. Layers in an image might include differences in exposure, focus, etc. The assumption is that the layers will be edited and merged down. A canvas is preferably an organized collection of layers and images. The assumption is that the images will be stitched and/or merged as a part of a larger image. A project is preferably a collection of layers, images, and canvases that are logically organized based on a project. The user can start by creating a new project or opening an existing one. The user can then decide whether to create an image or a canvas. At this point, the difference between an image and a canvas is whether the camera will be fixed or moved.
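The layer>image>canvas>project hierarchy can be written down directly. The field names below are assumptions chosen to mirror the description:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Layer:
    """A single captured image (one exposure/focus variant)."""
    exposure_s: float

@dataclass
class Image:
    """A set of layers from a single perspective, to be merged down."""
    layers: List[Layer] = field(default_factory=list)

@dataclass
class Canvas:
    """Layers and images to be stitched/merged into a larger image."""
    images: List[Image] = field(default_factory=list)

@dataclass
class Project:
    """Logically organized layers, images, and canvases for one job."""
    name: str
    images: List[Image] = field(default_factory=list)
    canvases: List[Canvas] = field(default_factory=list)
```

A fixed camera adds an Image (layers from one perspective); a moving camera adds a Canvas (images to be stitched), matching the image-versus-canvas distinction above.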
- The user can preferably now select how to set the focus, aperture, and exposure settings. Options can include Manual (e.g., each variable is set manually or using the floating target box and the brackets), Recommend (e.g., the image application analyzes the image and makes recommendations on bracketing and settings), and Automatic (e.g., the user can simply operate the shutter and the recommended settings are put in effect based on algorithms that make decisions relating to bracketing, focus, and exposure).
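A Recommend mode might derive a bracket from the metered scene range. The sketch below works in exposure-value (EV) stops; the step size and even-spacing rule are illustrative assumptions, not the application's algorithm:

```python
import math

def recommend_brackets(ev_min, ev_max, step_ev=2.0):
    """Recommend exposure values (EV) bracketing the metered scene range.

    Enough shots are chosen so that adjacent exposures are at most
    `step_ev` stops apart, and they are spread evenly across the range.
    """
    span = ev_max - ev_min
    shots = max(1, math.ceil(span / step_ev) + 1)
    if shots == 1:
        return [ev_min]
    step = span / (shots - 1)
    return [ev_min + i * step for i in range(shots)]
```

In Automatic mode, the same recommendation could simply be applied when the user operates the shutter, rather than being shown for confirmation.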
- This process is preferably the same as creating an image, except that the canvas is preferably defined and the brackets are preferably made based on first sampling the canvas. When the canvas has been sampled, the brackets will preferably be set for the composite canvas and applied (if necessary) to the individual images. The canvas can be set manually by entering “SetUp,” stepping the camera left and right to the end points, and clicking on the shutter. The canvas can be specified as 360 degree Pano (e.g., the
camera 10 preferably steps through a full circle to capture images (e.g., 14 steps)), Custom Pano (e.g., the image application preferably instructs the photographer to use the motor control to move the camera to the left and right limits of the image and then press the shutter button), or Spherical (e.g., the camera 10 is preferably able to create three rows of images—one row above the ground plane and one below. Setting these rows will involve manually panning the camera, perhaps with an audible detent). - In operation, referring to
FIG. 12, with further reference to FIGS. 1-11, a process 100 for capturing a panoramic image using the system 5 includes the stages shown. The process 100, however, is exemplary only and not limiting. The process 100 may be altered, e.g., by having stages added, removed, or rearranged. - At
stage 105, the user creates a new project in the image application. The project can be given an arbitrary name and/or assigned a name by the image application. - At
stage 110, the user captures one or more photographs and/or video using the camera 10. Capture of the photographs and/or video can be done manually and/or under the control of the image application (e.g., being executed in the camera 10). - In a configuration where the user manually positions the
camera 10, preferably, the display of the camera 10 includes an alignment icon 205 and alignment grid 210. As each picture is taken, one of the cells in the alignment grid 210 preferably turns a different color to indicate that the corresponding image has been captured. After a first image is captured, the user moves the camera 10 to the next location. Preferably, the user is alerted to the location of the next picture by keeping the alignment icon in the center of the screen of the camera 10. In some embodiments, the alignment grid 210 can change colors when the camera is properly aligned for the next shot. As successive pictures are taken, individual cells of the alignment grid 210 change color to indicate that a picture for that location has been captured. The user can preferably see, at a glance, which pictures are missing from the pictures needed to create a composite image. - In a configuration where the
camera 10 controls a motorized version of the mount 20, the camera 10 is configured to calculate the positioning of the camera 10. The calculations can be accomplished using, for example, sensors that are present in the camera 10, such as GPS, accelerometer, compass, facial recognition, etc. The camera 10 can calculate the various positions needed to produce a final composite image and control the mount 20 accordingly. The camera 10 can automatically capture the desired images as the mount 20 moves the camera 10. - At
stage 115, images captured by the camera 10 are saved to a memory. The storage can be present locally in the camera 10 and/or can be located remotely over a network connection. Each individual image captured by the camera 10 can be saved as captured, or saved in a bulk manner. The image application can be configured to save some pictures in the camera 10 and to save others (e.g., backup copies) in a remote storage location. - At
stage 120, images captured by the camera 10 are processed by the image application. Preferably, the image application merges multiple images captured by the camera 10 in order to create a composite image such as a panoramic, spherical, cylindrical, VR, and/or HDR image. This can be accomplished by, for example, stitching together the multiple images captured by the camera 10. The processing can be performed locally on the camera 10 and/or remotely (e.g., in a remote processor and/or in the cloud). - Other embodiments are within the scope and spirit of the invention.
- The subject matter described herein can be implemented in digital electronic circuitry, or in computer software, firmware, or hardware, including the structural means disclosed in this specification and structural equivalents thereof, or in combinations of them. The subject matter described herein can be implemented as one or more computer program products, such as one or more computer programs tangibly embodied in an information carrier (e.g., in a machine-readable storage device), or embodied in a propagated signal, for execution by, or to control the operation of, data processing apparatus (e.g., a programmable processor, a computer, or multiple computers). A computer program (also known as a program, software, software application, or code) can be written in any form of programming language, including compiled or interpreted languages, and it can be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment. A computer program does not necessarily correspond to a file. A program can be stored in a portion of a file that holds other programs or data, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code). A computer program can be deployed to be executed on one computer or on multiple computers at one site or distributed across multiple sites and interconnected by a communication network.
- The processes and logic flows described in this specification, including the method steps of the subject matter described herein, can be performed by one or more programmable processors executing one or more computer programs to perform functions of the subject matter described herein by operating on input data and generating output. The processes and logic flows can also be performed by, and apparatus of the subject matter described herein can be implemented as, special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application-specific integrated circuit).
- Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read-only memory or a random access memory or both. The essential elements of a computer are a processor for executing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic disks, magneto-optical disks, or optical disks. Information carriers suitable for embodying computer program instructions and data include all forms of non-volatile memory, including by way of example semiconductor memory devices (e.g., EPROM, EEPROM, and flash memory devices); magnetic disks (e.g., internal hard disks or removable disks); magneto-optical disks; and optical disks (e.g., CD and DVD disks). The processor and the memory can be supplemented by, or incorporated in, special purpose logic circuitry.
- To provide for interaction with a user, the subject matter described herein can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor, for displaying information to the user, and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well. For example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user can be received in any form, including acoustic, speech, or tactile input.
- The subject matter described herein can be implemented in a computing system that includes a back-end component (e.g., a data server), a middleware component (e.g., an application server), or a front-end component (e.g., a client computer having a graphical user interface or a web browser through which a user can interact with an implementation of the subject matter described herein), or any combination of such back-end, middleware, and front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include a local area network (“LAN”) and a wide area network (“WAN”), e.g., the Internet.
- While the foregoing application primarily discusses the
camera 10 capturing photographs, it will be recognized that video images can also be captured and processed. - It is noted that one or more references are incorporated herein. To the extent that any of the incorporated material is inconsistent with the present disclosure, the present disclosure shall control. Furthermore, material incorporated by reference herein should be disregarded to the extent necessary to preserve the validity of the claims.
- Further, while the description above refers to the invention, the description may include more than one invention.
Claims (12)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/081,354 US9213220B2 (en) | 2010-04-06 | 2011-04-06 | Camera control |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US32129710P | 2010-04-06 | 2010-04-06 | |
US13/081,354 US9213220B2 (en) | 2010-04-06 | 2011-04-06 | Camera control |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120062691A1 true US20120062691A1 (en) | 2012-03-15 |
US9213220B2 US9213220B2 (en) | 2015-12-15 |
Family
ID=44763270
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/081,354 Active - Reinstated 2032-09-06 US9213220B2 (en) | 2010-04-06 | 2011-04-06 | Camera control |
Country Status (2)
Country | Link |
---|---|
US (1) | US9213220B2 (en) |
WO (1) | WO2011127201A1 (en) |
Cited By (89)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100208131A1 (en) * | 2007-11-14 | 2010-08-19 | Stelvio Zarpellon | Orientable head for supporting video-photographic equipment |
US20120002075A1 (en) * | 2010-07-05 | 2012-01-05 | Sony Corporation | Imaging control apparatus, imaging control method, and program |
US20130314699A1 (en) * | 2012-05-25 | 2013-11-28 | Solmetric Corp. | Solar resource measurement system |
US20140022364A1 (en) * | 2012-07-20 | 2014-01-23 | Freedom Scientific, Inc. | Multiposition Magnifier Camera |
US20140135062A1 (en) * | 2011-11-14 | 2014-05-15 | JoeBen Bevirt | Positioning apparatus for photographic and video imaging and recording and system utilizing same |
US20140168356A1 (en) * | 2012-12-17 | 2014-06-19 | Egos Ventures Inc. | Vibration-induced rotation of electronic devices |
US20140184781A1 (en) * | 2012-12-29 | 2014-07-03 | Honeywell International Inc. | Furnace tube inspection |
US8807849B2 (en) | 2011-10-12 | 2014-08-19 | Padcaster Llc | Frame and insert for mounting mobile device to a tripod |
US20140320679A1 (en) * | 2013-04-24 | 2014-10-30 | Microsoft Corporation | Motion blur avoidance |
CN104365083A (en) * | 2012-06-06 | 2015-02-18 | 索尼公司 | Image processing device, image processing method, and program |
US8971698B2 (en) * | 2012-03-06 | 2015-03-03 | Earl J PeQueen | Detachable viewfinder |
US20150062291A1 (en) * | 2013-09-05 | 2015-03-05 | Lg Electronics Inc. | Mobile terminal and control method therof |
US20150163393A1 (en) * | 2013-12-06 | 2015-06-11 | Jwin Electronics Corp. | Remote control built-in device case for electronic device |
USD732545S1 (en) | 2014-06-06 | 2015-06-23 | Padcaster Llc | Tablet holder |
US9074847B1 (en) * | 2014-08-28 | 2015-07-07 | Flex Force Enterprises LLC | Stabilized weapon platform with active sense and adaptive motion control |
US20150241713A1 (en) * | 2014-02-21 | 2015-08-27 | The Lightco Inc. | Camera holder methods and apparatus |
US20150288857A1 (en) * | 2014-04-07 | 2015-10-08 | Microsoft Corporation | Mount that facilitates positioning and orienting a mobile computing device |
WO2015173828A1 (en) * | 2014-05-15 | 2015-11-19 | Showbox Ltd. | Methods, circuits, devices, systems and associated computer executable code for composing composite content |
US20160014921A1 (en) * | 2012-12-17 | 2016-01-14 | Egos Ventures Inc. | System and method of using a recharger plug as a stand for electronic devices |
EP2863622A4 (en) * | 2012-06-13 | 2016-03-02 | Sony Corp | Recommending device, method, and program |
WO2016061011A3 (en) * | 2014-10-15 | 2016-06-09 | Microsoft Technology Licensing, Llc | Camera capture recommendation for applications |
US9458963B1 (en) * | 2015-03-31 | 2016-10-04 | Swiftcam Technologies Group Company Limited | 360-degree gimbal system |
EP3086016A1 (en) * | 2015-04-22 | 2016-10-26 | Novona AG | Motorized camera holder |
CN106131402A (en) * | 2016-06-29 | 2016-11-16 | 广东欧珀移动通信有限公司 | A kind of self-shooting bar and self-heterodyne system |
US20170006205A1 (en) * | 2015-06-30 | 2017-01-05 | Guilin Feiyu Technology Corporation Ltd. | Stabilizer applicable for moving shooting |
US9544501B2 (en) | 2013-10-18 | 2017-01-10 | Light Labs Inc. | Methods and apparatus for implementing and/or using a camera device |
US9544503B2 (en) | 2014-12-30 | 2017-01-10 | Light Labs Inc. | Exposure control methods and apparatus |
US9547160B2 (en) | 2013-01-05 | 2017-01-17 | Light Labs Inc. | Methods and apparatus for capturing and/or processing images |
US9551854B2 (en) | 2013-10-18 | 2017-01-24 | Light Labs Inc. | Methods and apparatus for controlling sensors to capture images in a synchronized manner |
US9554031B2 (en) | 2013-12-31 | 2017-01-24 | Light Labs Inc. | Camera focusing related methods and apparatus |
US20170080875A1 (en) * | 2015-09-17 | 2017-03-23 | Hon Hai Precision Industry Co., Ltd. | Fastening apparatus |
US20170111586A1 (en) * | 2012-12-26 | 2017-04-20 | Sony Corporation | Image processing device and method, and program |
US20170161956A1 (en) * | 2015-12-02 | 2017-06-08 | Seiko Epson Corporation | Head-mounted display device and computer program |
US9686471B2 (en) | 2013-11-01 | 2017-06-20 | Light Labs Inc. | Methods and apparatus relating to image stabilization |
WO2017117039A1 (en) * | 2015-12-28 | 2017-07-06 | Sphere Optics Company, Llc | Omnidirectional catadioptric lens with odd aspheric contour or multi-lens |
US9736365B2 (en) | 2013-10-26 | 2017-08-15 | Light Labs Inc. | Zoom related methods and apparatus |
US9749511B2 (en) | 2013-10-18 | 2017-08-29 | Light Labs Inc. | Methods and apparatus relating to a camera including multiple optical chains |
US9749549B2 (en) | 2015-10-06 | 2017-08-29 | Light Labs Inc. | Methods and apparatus for facilitating selective blurring of one or more image portions |
US9800786B1 (en) * | 2016-04-20 | 2017-10-24 | Guilin Feiyu Technology Corporation Ltd. | Shooting apparatus with stabilizer module |
US9800778B2 (en) * | 2015-07-21 | 2017-10-24 | Qualcomm Incorporated | Camera orientation notification system |
US9824427B2 (en) | 2015-04-15 | 2017-11-21 | Light Labs Inc. | Methods and apparatus for generating a sharp image |
US9857584B2 (en) | 2015-04-17 | 2018-01-02 | Light Labs Inc. | Camera device methods, apparatus and components |
WO2018022197A1 (en) * | 2016-07-26 | 2018-02-01 | Qualcomm Incorporated | Systems and methods for compositing images |
US9912865B2 (en) | 2014-10-17 | 2018-03-06 | Light Labs Inc. | Methods and apparatus for supporting burst modes of camera operation |
US9930233B2 (en) | 2015-04-22 | 2018-03-27 | Light Labs Inc. | Filter mounting methods and apparatus and related camera apparatus |
US9967535B2 (en) | 2015-04-17 | 2018-05-08 | Light Labs Inc. | Methods and apparatus for reducing noise in images |
US9979878B2 (en) | 2014-02-21 | 2018-05-22 | Light Labs Inc. | Intuitive camera user interface methods and apparatus |
US9987184B2 (en) | 2013-02-05 | 2018-06-05 | Valentin Borovinov | Systems, methods, and media for providing video of a burial memorial |
US9998638B2 (en) | 2014-12-17 | 2018-06-12 | Light Labs Inc. | Methods and apparatus for implementing and using camera devices |
US10003738B2 (en) | 2015-12-18 | 2018-06-19 | Light Labs Inc. | Methods and apparatus for detecting and/or indicating a blocked sensor or camera module |
US10021286B2 (en) | 2011-11-14 | 2018-07-10 | Gopro, Inc. | Positioning apparatus for photographic and video imaging and recording and system utilizing the same |
CN108351681A (en) * | 2015-12-15 | 2018-07-31 | 三星电子株式会社 | Electronic equipment and its bracket |
US10051182B2 (en) | 2015-10-05 | 2018-08-14 | Light Labs Inc. | Methods and apparatus for compensating for motion and/or changing light conditions during image capture |
US10075651B2 (en) | 2015-04-17 | 2018-09-11 | Light Labs Inc. | Methods and apparatus for capturing images using multiple camera modules in an efficient manner |
US10091447B2 (en) | 2015-04-17 | 2018-10-02 | Light Labs Inc. | Methods and apparatus for synchronizing readout of multiple image sensors |
US10110794B2 (en) | 2014-07-09 | 2018-10-23 | Light Labs Inc. | Camera device including multiple optical chains and related methods |
US20180316843A1 (en) * | 2011-11-14 | 2018-11-01 | Gopro, Inc. | Positioning Apparatus for Photographic and Video Imaging and Recording and System Utilizing the Same |
US10129483B2 (en) | 2015-06-23 | 2018-11-13 | Light Labs Inc. | Methods and apparatus for implementing zoom using one or more moveable camera modules |
US10192133B2 (en) | 2015-06-22 | 2019-01-29 | Seiko Epson Corporation | Marker, method of detecting position and pose of marker, and computer program |
US10191356B2 (en) | 2014-07-04 | 2019-01-29 | Light Labs Inc. | Methods and apparatus relating to detection and/or indicating a dirty lens condition |
US10192361B2 (en) | 2015-07-06 | 2019-01-29 | Seiko Epson Corporation | Head-mounted display device and computer program |
US10198865B2 (en) | 2014-07-10 | 2019-02-05 | Seiko Epson Corporation | HMD calibration with direct geometric modeling |
US10225445B2 (en) | 2015-12-18 | 2019-03-05 | Light Labs Inc. | Methods and apparatus for providing a camera lens or viewing point indicator |
US10284823B1 (en) * | 2016-10-12 | 2019-05-07 | Google Llc | Capturing photographs based on rotation of device |
US20190158719A1 (en) * | 2016-07-27 | 2019-05-23 | Guangdong Sirui Optical Co., Ltd. | Intelligent ball head and method for performing self-photographing by using the same |
US10306218B2 (en) | 2016-03-22 | 2019-05-28 | Light Labs Inc. | Camera calibration apparatus and methods |
US10348970B2 (en) * | 2016-08-12 | 2019-07-09 | Lg Electronics Inc. | Mobile terminal and method of operating the same |
US10348874B2 (en) * | 2017-05-19 | 2019-07-09 | Conor Penfold | System and method for improving a photographic camera feature on a portable electronic device |
US10365480B2 (en) | 2015-08-27 | 2019-07-30 | Light Labs Inc. | Methods and apparatus for implementing and/or using camera devices with one or more light redirection devices |
US10491806B2 (en) | 2015-08-03 | 2019-11-26 | Light Labs Inc. | Camera device control related methods and apparatus |
US10511749B2 (en) | 2016-02-26 | 2019-12-17 | Freedom Scientific, Inc. | Magnifier device with adjustable camera |
US10536671B1 (en) * | 2011-12-06 | 2020-01-14 | Musco Corporation | Apparatus, system and method for tracking subject with still or video camera |
CN111742544A (en) * | 2018-01-05 | 2020-10-02 | GoPro, Inc. | Modular image capture system |
US10902609B2 (en) | 2016-06-06 | 2021-01-26 | Sz Dji Osmo Technology Co., Ltd. | Image processing for tracking |
US10931866B2 (en) | 2014-01-05 | 2021-02-23 | Light Labs Inc. | Methods and apparatus for receiving and storing in a camera a user controllable setting that is used to control composite image generation performed after image capture |
US11106928B2 (en) * | 2016-06-06 | 2021-08-31 | Sz Dji Osmo Technology Co., Ltd. | Carrier-assisted tracking |
US11163289B2 (en) * | 2017-02-24 | 2021-11-02 | Sharp Kabushiki Kaisha | Control device, terminal device, cradle, notification system, control method, and storage medium |
US11184548B2 (en) * | 2015-12-22 | 2021-11-23 | Sz Dji Osmo Technology Co., Ltd. | Imaging device, and method and apparatus for controlling the imaging device |
US20210387347A1 (en) * | 2020-06-12 | 2021-12-16 | Selfie Snapper, Inc. | Robotic arm camera |
USD939607S1 (en) | 2020-07-10 | 2021-12-28 | Selfie Snapper, Inc. | Selfie camera |
US20220057695A1 (en) * | 2014-04-28 | 2022-02-24 | SZ DJI Technology Co., Ltd. | Interchangeable mounting platform |
US11283982B2 (en) | 2019-07-07 | 2022-03-22 | Selfie Snapper, Inc. | Selfie camera |
US11344107B1 (en) * | 2012-10-03 | 2022-05-31 | Dimitri Girault | Electronic handheld camera supporter |
US20220381394A1 (en) * | 2021-05-28 | 2022-12-01 | Peak Design | Mobile tripod mounting system |
US20230199316A1 (en) * | 2021-12-17 | 2023-06-22 | Matterport | Motor mount for image capture of surrounding environment |
US11687169B2 (en) | 2011-11-30 | 2023-06-27 | Ricoh Company, Ltd. | Image display control device, image display system, and computer program product |
US11727657B2 (en) | 2021-04-09 | 2023-08-15 | Carvana, LLC | Systems and methods for rendering a portion of a 3D display |
US11901841B2 (en) | 2019-12-31 | 2024-02-13 | Selfie Snapper, Inc. | Electroadhesion device with voltage control module |
US11962896B2 (en) | 2022-10-31 | 2024-04-16 | Gopro, Inc. | Positioning apparatus for photographic and video imaging and recording and system utilizing the same |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104137048B (en) | 2011-12-28 | 2019-10-25 | Nokia Technologies Oy | Provision of an open instance of an application |
US8996729B2 (en) | 2012-04-12 | 2015-03-31 | Nokia Corporation | Method and apparatus for synchronizing tasks performed by multiple devices |
WO2013101813A1 (en) * | 2011-12-28 | 2013-07-04 | Nokia Corporation | Camera control application |
EP2798483A1 (en) | 2011-12-28 | 2014-11-05 | Nokia Corporation | Application switcher |
JP2015156051A (en) * | 2012-06-06 | 2015-08-27 | Sony Corporation | Image processing apparatus, image processing method, and program |
US10021296B2 (en) * | 2013-12-31 | 2018-07-10 | Futurewei Technologies, Inc. | Automatic rotatable camera for panorama taking in mobile terminals |
WO2015179574A1 (en) * | 2014-05-20 | 2015-11-26 | Nextvr Inc. | Methods and apparatus including or for use with one or more cameras |
USD802578S1 (en) * | 2015-10-02 | 2017-11-14 | Alberto Rene Somoano | Tripod |
US9950897B2 (en) * | 2016-01-28 | 2018-04-24 | Wipro Limited | Apparatus for holding a card |
WO2018177564A1 (en) * | 2017-03-30 | 2018-10-04 | Grau Paniello Sergi | Mounts for mobile devices |
US11070712B2 (en) | 2019-08-30 | 2021-07-20 | Puwell Technology Llc | Method and system for control of a digital camera system |
US11003059B1 (en) | 2020-02-29 | 2021-05-11 | Rohan Taylor | Dynograph device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3870264A (en) * | 1973-03-26 | 1975-03-11 | William I Robinson | Stand |
US6141034A (en) * | 1995-12-15 | 2000-10-31 | Immersive Media Co. | Immersive imaging method and apparatus |
US20040109059A1 (en) * | 2002-11-12 | 2004-06-10 | Kevin Kawakita | Hybrid Joint Photographers Experts Group (JPEG)/Moving Picture Experts Group (MPEG) specialized security video camera |
US20060064273A1 (en) * | 2003-01-16 | 2006-03-23 | Ntt Infrastructure Network Corporation | Small automatic motion control photographing system |
US20070019946A1 (en) * | 2005-07-19 | 2007-01-25 | Clint Clemens | Methods for creating spherical imagery |
US7675539B2 (en) * | 2005-01-17 | 2010-03-09 | Sony Corporation | Camera control apparatus, camera system, electronic conference system, and camera control method |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006122189A2 (en) | 2005-05-10 | 2006-11-16 | Stafford Gregory R | Method, device and system for capturing digital images in a variety of settings and venues |
US20100045773A1 (en) * | 2007-11-06 | 2010-02-25 | Ritchey Kurtis J | Panoramic adapter system and method with spherical field-of-view coverage |
KR100758632B1 (en) | 2006-03-29 | 2007-09-14 | Samsung Electronics Co., Ltd. | Apparatus and method for taking panoramic photograph |
CN101420525A (en) * | 2007-10-26 | 2009-04-29 | Hon Hai Precision Industry (Shenzhen) Co., Ltd. | Photographing apparatus and method |
2011
- 2011-04-06 WO PCT/US2011/031459 patent/WO2011127201A1/en active Application Filing
- 2011-04-06 US US13/081,354 patent/US9213220B2/en active Active - Reinstated
Cited By (140)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8848101B2 (en) * | 2007-11-14 | 2014-09-30 | Gitzo S.A. | Orientable head for supporting video-photographic equipment |
US20100208131A1 (en) * | 2007-11-14 | 2010-08-19 | Stelvio Zarpellon | Orientable head for supporting video-photographic equipment |
US20120002075A1 (en) * | 2010-07-05 | 2012-01-05 | Sony Corporation | Imaging control apparatus, imaging control method, and program |
US9716831B2 (en) * | 2010-07-05 | 2017-07-25 | Sony Corporation | Imaging control apparatus, imaging control method, and program |
US8807849B2 (en) | 2011-10-12 | 2014-08-19 | Padcaster Llc | Frame and insert for mounting mobile device to a tripod |
US9588405B2 (en) | 2011-10-12 | 2017-03-07 | Padcaster Llc | Frame and insert for mounting a mobile device to a tripod |
US9033595B2 (en) | 2011-10-12 | 2015-05-19 | Padcaster Llc | Frame and insert for mounting a mobile device to a tripod |
US10021286B2 (en) | 2011-11-14 | 2018-07-10 | Gopro, Inc. | Positioning apparatus for photographic and video imaging and recording and system utilizing the same |
US20140135062A1 (en) * | 2011-11-14 | 2014-05-15 | JoeBen Bevirt | Positioning apparatus for photographic and video imaging and recording and system utilizing same |
US10791257B2 (en) | 2011-11-14 | 2020-09-29 | Gopro, Inc. | Positioning apparatus for photographic and video imaging and recording and system utilizing the same |
US10462347B2 (en) | 2011-11-14 | 2019-10-29 | Gopro, Inc. | Positioning apparatus for photographic and video imaging and recording and system utilizing the same |
US11489995B2 (en) | 2011-11-14 | 2022-11-01 | Gopro, Inc. | Positioning apparatus for photographic and video imaging and recording and system utilizing the same |
US20180316843A1 (en) * | 2011-11-14 | 2018-11-01 | Gopro, Inc. | Positioning Apparatus for Photographic and Video Imaging and Recording and System Utilizing the Same |
US11687169B2 (en) | 2011-11-30 | 2023-06-27 | Ricoh Company, Ltd. | Image display control device, image display system, and computer program product |
US10536671B1 (en) * | 2011-12-06 | 2020-01-14 | Musco Corporation | Apparatus, system and method for tracking subject with still or video camera |
US8971698B2 (en) * | 2012-03-06 | 2015-03-03 | Earl J PeQueen | Detachable viewfinder |
US20130314699A1 (en) * | 2012-05-25 | 2013-11-28 | Solmetric Corp. | Solar resource measurement system |
US20150124049A1 (en) * | 2012-06-06 | 2015-05-07 | Sony Corporation | Image processing apparatus, image processing method, and program |
CN104365083A (en) * | 2012-06-06 | 2015-02-18 | Sony Corporation | Image processing device, image processing method, and program |
US11102402B2 (en) | 2012-06-06 | 2021-08-24 | Sony Corporation | Image processing apparatus, image processing method, and program |
US10091417B2 (en) * | 2012-06-06 | 2018-10-02 | Sony Corporation | Image processing apparatus, image processing method, and program |
EP2863622A4 (en) * | 2012-06-13 | 2016-03-02 | Sony Corp | Recommending device, method, and program |
US10708546B2 (en) | 2012-07-20 | 2020-07-07 | Freedom Scientific, Inc. | Multiposition magnifier camera |
US20140022364A1 (en) * | 2012-07-20 | 2014-01-23 | Freedom Scientific, Inc. | Multiposition Magnifier Camera |
US9992453B2 (en) * | 2012-07-20 | 2018-06-05 | Freedom Scientific, Inc. | Multiposition magnifier camera |
US11344107B1 (en) * | 2012-10-03 | 2022-05-31 | Dimitri Girault | Electronic handheld camera supporter |
US20140168356A1 (en) * | 2012-12-17 | 2014-06-19 | Egos Ventures Inc. | Vibration-induced rotation of electronic devices |
US9479698B2 (en) * | 2012-12-17 | 2016-10-25 | Egos Ventures Inc. | Vibration-induced rotation of electronic devices |
US20160014921A1 (en) * | 2012-12-17 | 2016-01-14 | Egos Ventures Inc. | System and method of using a recharger plug as a stand for electronic devices |
US10039198B2 (en) * | 2012-12-17 | 2018-07-31 | Egos Ventures Inc. | System and method of using a recharger plug as a stand for electronic devices |
US20170111586A1 (en) * | 2012-12-26 | 2017-04-20 | Sony Corporation | Image processing device and method, and program |
US10110817B2 (en) * | 2012-12-26 | 2018-10-23 | Sony Corporation | Image processing device and method, and program for correcting an imaging direction |
US20140184781A1 (en) * | 2012-12-29 | 2014-07-03 | Honeywell International Inc. | Furnace tube inspection |
US9547160B2 (en) | 2013-01-05 | 2017-01-17 | Light Labs Inc. | Methods and apparatus for capturing and/or processing images |
US9690079B2 (en) | 2013-01-05 | 2017-06-27 | Light Labs Inc. | Camera methods and apparatus using optical chain modules which alter the direction of received light |
US9671595B2 (en) | 2013-01-05 | 2017-06-06 | Light Labs Inc. | Methods and apparatus for using multiple optical chains in parallel |
US9568713B2 (en) | 2013-01-05 | 2017-02-14 | Light Labs Inc. | Methods and apparatus for using multiple optical chains in parallel to support separate color-capture |
US9987184B2 (en) | 2013-02-05 | 2018-06-05 | Valentin Borovinov | Systems, methods, and media for providing video of a burial memorial |
US9538083B2 (en) | 2013-04-24 | 2017-01-03 | Microsoft Technology Licensing, Llc | Motion blur avoidance |
US20140320679A1 (en) * | 2013-04-24 | 2014-10-30 | Microsoft Corporation | Motion blur avoidance |
US9232137B2 (en) * | 2013-04-24 | 2016-01-05 | Microsoft Technology Licensing, Llc | Motion blur avoidance |
US20150062291A1 (en) * | 2013-09-05 | 2015-03-05 | Lg Electronics Inc. | Mobile terminal and control method thereof |
US9549127B2 (en) | 2013-10-18 | 2017-01-17 | Light Labs Inc. | Image capture control methods and apparatus |
US9851527B2 (en) | 2013-10-18 | 2017-12-26 | Light Labs Inc. | Methods and apparatus for capturing and/or combining images |
US9563033B2 (en) | 2013-10-18 | 2017-02-07 | Light Labs Inc. | Methods and apparatus for capturing images and/or for using captured images |
US9557519B2 (en) | 2013-10-18 | 2017-01-31 | Light Labs Inc. | Methods and apparatus for implementing a camera device supporting a number of different focal lengths |
US9749511B2 (en) | 2013-10-18 | 2017-08-29 | Light Labs Inc. | Methods and apparatus relating to a camera including multiple optical chains |
US10509208B2 (en) * | 2013-10-18 | 2019-12-17 | Light Labs Inc. | Methods and apparatus for implementing and/or using a camera device |
US9557520B2 (en) | 2013-10-18 | 2017-01-31 | Light Labs Inc. | Synchronized image capture methods and apparatus |
US9544501B2 (en) | 2013-10-18 | 2017-01-10 | Light Labs Inc. | Methods and apparatus for implementing and/or using a camera device |
US9551854B2 (en) | 2013-10-18 | 2017-01-24 | Light Labs Inc. | Methods and apparatus for controlling sensors to capture images in a synchronized manner |
US9736365B2 (en) | 2013-10-26 | 2017-08-15 | Light Labs Inc. | Zoom related methods and apparatus |
US9686471B2 (en) | 2013-11-01 | 2017-06-20 | Light Labs Inc. | Methods and apparatus relating to image stabilization |
US20150163393A1 (en) * | 2013-12-06 | 2015-06-11 | Jwin Electronics Corp. | Remote control built-in device case for electronic device |
US9554031B2 (en) | 2013-12-31 | 2017-01-24 | Light Labs Inc. | Camera focusing related methods and apparatus |
US10931866B2 (en) | 2014-01-05 | 2021-02-23 | Light Labs Inc. | Methods and apparatus for receiving and storing in a camera a user controllable setting that is used to control composite image generation performed after image capture |
US20150241713A1 (en) * | 2014-02-21 | 2015-08-27 | The Lightco Inc. | Camera holder methods and apparatus |
US9979878B2 (en) | 2014-02-21 | 2018-05-22 | Light Labs Inc. | Intuitive camera user interface methods and apparatus |
US20150288857A1 (en) * | 2014-04-07 | 2015-10-08 | Microsoft Corporation | Mount that facilitates positioning and orienting a mobile computing device |
CN106164562A (en) * | 2014-04-07 | 2016-11-23 | Microsoft Technology Licensing, LLC | Mount that facilitates positioning and orienting a mobile computing device |
US11927877B2 (en) * | 2014-04-28 | 2024-03-12 | SZ DJI Technology Co., Ltd. | Interchangeable mounting platform |
US20220057695A1 (en) * | 2014-04-28 | 2022-02-24 | SZ DJI Technology Co., Ltd. | Interchangeable mounting platform |
US9715900B2 (en) | 2014-05-15 | 2017-07-25 | Showbox Ltd. | Methods, circuits, devices, systems and associated computer executable code for composing composite content |
WO2015173828A1 (en) * | 2014-05-15 | 2015-11-19 | Showbox Ltd. | Methods, circuits, devices, systems and associated computer executable code for composing composite content |
USD732545S1 (en) | 2014-06-06 | 2015-06-23 | Padcaster Llc | Tablet holder |
US10191356B2 (en) | 2014-07-04 | 2019-01-29 | Light Labs Inc. | Methods and apparatus relating to detection and/or indicating a dirty lens condition |
US10110794B2 (en) | 2014-07-09 | 2018-10-23 | Light Labs Inc. | Camera device including multiple optical chains and related methods |
US10198865B2 (en) | 2014-07-10 | 2019-02-05 | Seiko Epson Corporation | HMD calibration with direct geometric modeling |
US9074847B1 (en) * | 2014-08-28 | 2015-07-07 | Flex Force Enterprises LLC | Stabilized weapon platform with active sense and adaptive motion control |
US9723200B2 (en) | 2014-10-15 | 2017-08-01 | Microsoft Technology Licensing, Llc | Camera capture recommendation for applications |
WO2016061011A3 (en) * | 2014-10-15 | 2016-06-09 | Microsoft Technology Licensing, Llc | Camera capture recommendation for applications |
US9912864B2 (en) | 2014-10-17 | 2018-03-06 | Light Labs Inc. | Methods and apparatus for using a camera device to support multiple modes of operation |
US9912865B2 (en) | 2014-10-17 | 2018-03-06 | Light Labs Inc. | Methods and apparatus for supporting burst modes of camera operation |
US9998638B2 (en) | 2014-12-17 | 2018-06-12 | Light Labs Inc. | Methods and apparatus for implementing and using camera devices |
US9544503B2 (en) | 2014-12-30 | 2017-01-10 | Light Labs Inc. | Exposure control methods and apparatus |
US9458963B1 (en) * | 2015-03-31 | 2016-10-04 | Swiftcam Technologies Group Company Limited | 360-degree gimbal system |
US9824427B2 (en) | 2015-04-15 | 2017-11-21 | Light Labs Inc. | Methods and apparatus for generating a sharp image |
US9857584B2 (en) | 2015-04-17 | 2018-01-02 | Light Labs Inc. | Camera device methods, apparatus and components |
US9967535B2 (en) | 2015-04-17 | 2018-05-08 | Light Labs Inc. | Methods and apparatus for reducing noise in images |
US10075651B2 (en) | 2015-04-17 | 2018-09-11 | Light Labs Inc. | Methods and apparatus for capturing images using multiple camera modules in an efficient manner |
US10091447B2 (en) | 2015-04-17 | 2018-10-02 | Light Labs Inc. | Methods and apparatus for synchronizing readout of multiple image sensors |
EP3086016A1 (en) * | 2015-04-22 | 2016-10-26 | Novona AG | Motorized camera holder |
US9930233B2 (en) | 2015-04-22 | 2018-03-27 | Light Labs Inc. | Filter mounting methods and apparatus and related camera apparatus |
US10192133B2 (en) | 2015-06-22 | 2019-01-29 | Seiko Epson Corporation | Marker, method of detecting position and pose of marker, and computer program |
US10296805B2 (en) | 2015-06-22 | 2019-05-21 | Seiko Epson Corporation | Marker, method of detecting position and pose of marker, and computer program |
US10129483B2 (en) | 2015-06-23 | 2018-11-13 | Light Labs Inc. | Methods and apparatus for implementing zoom using one or more moveable camera modules |
US20170006205A1 (en) * | 2015-06-30 | 2017-01-05 | Guilin Feiyu Technology Corporation Ltd. | Stabilizer applicable for moving shooting |
US10242504B2 (en) | 2015-07-06 | 2019-03-26 | Seiko Epson Corporation | Head-mounted display device and computer program |
US10192361B2 (en) | 2015-07-06 | 2019-01-29 | Seiko Epson Corporation | Head-mounted display device and computer program |
US9800778B2 (en) * | 2015-07-21 | 2017-10-24 | Qualcomm Incorporated | Camera orientation notification system |
US10491806B2 (en) | 2015-08-03 | 2019-11-26 | Light Labs Inc. | Camera device control related methods and apparatus |
US10365480B2 (en) | 2015-08-27 | 2019-07-30 | Light Labs Inc. | Methods and apparatus for implementing and/or using camera devices with one or more light redirection devices |
US9738234B2 (en) * | 2015-09-17 | 2017-08-22 | ScienBiziP Consulting(Shenzhen)Co., Ltd. | Fastening apparatus |
US20170080875A1 (en) * | 2015-09-17 | 2017-03-23 | Hon Hai Precision Industry Co., Ltd. | Fastening apparatus |
US10051182B2 (en) | 2015-10-05 | 2018-08-14 | Light Labs Inc. | Methods and apparatus for compensating for motion and/or changing light conditions during image capture |
US9749549B2 (en) | 2015-10-06 | 2017-08-29 | Light Labs Inc. | Methods and apparatus for facilitating selective blurring of one or more image portions |
US20170161956A1 (en) * | 2015-12-02 | 2017-06-08 | Seiko Epson Corporation | Head-mounted display device and computer program |
US10347048B2 (en) * | 2015-12-02 | 2019-07-09 | Seiko Epson Corporation | Controlling a display of a head-mounted display device |
US10363663B2 (en) | 2015-12-15 | 2019-07-30 | Samsung Electronics Co., Ltd. | Electronic device and cradle thereof |
EP3329347A4 (en) * | 2015-12-15 | 2018-10-24 | Samsung Electronics Co., Ltd. | Electronic device and cradle thereof |
CN108351681A (en) * | 2015-12-15 | 2018-07-31 | Samsung Electronics Co., Ltd. | Electronic device and cradle thereof |
US10225445B2 (en) | 2015-12-18 | 2019-03-05 | Light Labs Inc. | Methods and apparatus for providing a camera lens or viewing point indicator |
US10003738B2 (en) | 2015-12-18 | 2018-06-19 | Light Labs Inc. | Methods and apparatus for detecting and/or indicating a blocked sensor or camera module |
US11184548B2 (en) * | 2015-12-22 | 2021-11-23 | Sz Dji Osmo Technology Co., Ltd. | Imaging device, and method and apparatus for controlling the imaging device |
WO2017117039A1 (en) * | 2015-12-28 | 2017-07-06 | Sphere Optics Company, Llc | Omnidirectional catadioptric lens with odd aspheric contour or multi-lens |
US11137582B2 | 2015-12-28 | 2021-10-05 | Sphere Optics Company, Llc | Omnidirectional catadioptric lens with odd aspheric contour or multi-lens |
US11201990B2 (en) | 2016-02-26 | 2021-12-14 | Freedom Scientific, Inc. | Magnifier device with adjustable camera |
US10511749B2 (en) | 2016-02-26 | 2019-12-17 | Freedom Scientific, Inc. | Magnifier device with adjustable camera |
US10306218B2 (en) | 2016-03-22 | 2019-05-28 | Light Labs Inc. | Camera calibration apparatus and methods |
US9800786B1 (en) * | 2016-04-20 | 2017-10-24 | Guilin Feiyu Technology Corporation Ltd. | Shooting apparatus with stabilizer module |
US20170310868A1 (en) * | 2016-04-20 | 2017-10-26 | Guilin Feiyu Technology Corporation Ltd. | Shooting apparatus with stabilizer module |
US11568626B2 (en) | 2016-06-06 | 2023-01-31 | Sz Dji Osmo Technology Co., Ltd. | Carrier-assisted tracking |
US10902609B2 (en) | 2016-06-06 | 2021-01-26 | Sz Dji Osmo Technology Co., Ltd. | Image processing for tracking |
US11106928B2 (en) * | 2016-06-06 | 2021-08-31 | Sz Dji Osmo Technology Co., Ltd. | Carrier-assisted tracking |
US9894278B2 (en) * | 2016-06-08 | 2018-02-13 | Guilin Feiyu Technology Corporation Ltd. | Stabilizer applicable for moving shooting |
CN106131402A (en) * | 2016-06-29 | 2016-11-16 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Selfie stick and selfie system |
US10290111B2 (en) | 2016-07-26 | 2019-05-14 | Qualcomm Incorporated | Systems and methods for compositing images |
CN109478317A (en) * | 2016-07-26 | 2019-03-15 | Qualcomm Incorporated | Systems and methods for compositing images |
WO2018022197A1 (en) * | 2016-07-26 | 2018-02-01 | Qualcomm Incorporated | Systems and methods for compositing images |
US10582106B2 (en) * | 2016-07-27 | 2020-03-03 | Guangdong Sirui Optical Co., Ltd. | Intelligent ball head and method for performing self-photographing by using the same |
US20190158719A1 (en) * | 2016-07-27 | 2019-05-23 | Guangdong Sirui Optical Co., Ltd. | Intelligent ball head and method for performing self-photographing by using the same |
US10348970B2 (en) * | 2016-08-12 | 2019-07-09 | Lg Electronics Inc. | Mobile terminal and method of operating the same |
US10284823B1 (en) * | 2016-10-12 | 2019-05-07 | Google Llc | Capturing photographs based on rotation of device |
US11163289B2 (en) * | 2017-02-24 | 2021-11-02 | Sharp Kabushiki Kaisha | Control device, terminal device, cradle, notification system, control method, and storage medium |
US10348874B2 (en) * | 2017-05-19 | 2019-07-09 | Conor Penfold | System and method for improving a photographic camera feature on a portable electronic device |
US11653095B2 (en) | 2018-01-05 | 2023-05-16 | Gopro, Inc. | Modular image capture systems |
USD991315S1 (en) | 2018-01-05 | 2023-07-04 | Gopro, Inc. | Camera |
CN111742544A (en) * | 2018-01-05 | 2020-10-02 | GoPro, Inc. | Modular image capture system |
USD992619S1 (en) | 2018-01-05 | 2023-07-18 | Gopro, Inc. | Camera |
US11283982B2 (en) | 2019-07-07 | 2022-03-22 | Selfie Snapper, Inc. | Selfie camera |
US11770607B2 (en) | 2019-07-07 | 2023-09-26 | Selfie Snapper, Inc. | Electroadhesion device |
US11901841B2 (en) | 2019-12-31 | 2024-02-13 | Selfie Snapper, Inc. | Electroadhesion device with voltage control module |
WO2021252960A1 (en) * | 2020-06-12 | 2021-12-16 | Selfie Snapper, Inc. | Robotic arm camera |
US20210387347A1 (en) * | 2020-06-12 | 2021-12-16 | Selfie Snapper, Inc. | Robotic arm camera |
USD939607S1 (en) | 2020-07-10 | 2021-12-28 | Selfie Snapper, Inc. | Selfie camera |
US11727657B2 (en) | 2021-04-09 | 2023-08-15 | Carvana, LLC | Systems and methods for rendering a portion of a 3D display |
US11719382B2 (en) * | 2021-05-28 | 2023-08-08 | Peak Design | Mobile tripod mounting system |
US20220381394A1 (en) * | 2021-05-28 | 2022-12-01 | Peak Design | Mobile tripod mounting system |
US20230199316A1 (en) * | 2021-12-17 | 2023-06-22 | Matterport | Motor mount for image capture of surrounding environment |
US11962896B2 (en) | 2022-10-31 | 2024-04-16 | Gopro, Inc. | Positioning apparatus for photographic and video imaging and recording and system utilizing the same |
Also Published As
Publication number | Publication date |
---|---|
US9213220B2 (en) | 2015-12-15 |
WO2011127201A1 (en) | 2011-10-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9213220B2 (en) | Camera control | |
JP6106764B2 (en) | Imaging apparatus and time-lapse imaging method | |
CN101046623B (en) | Apparatus and method for taking panoramic photograph | |
WO2017067520A1 (en) | Mobile terminal having binocular cameras and photographing method therefor | |
US9894274B2 (en) | Digital photographing apparatus, methods of controlling the same, and computer-readable storage medium to increase success rates in panoramic photography | |
AU2012352520B2 (en) | Multiple-angle imagery of physical objects | |
US20130258122A1 (en) | Method and device for motion enhanced image capture | |
CN109076161B (en) | Image processing method, mobile platform, control equipment and system | |
US9742995B2 (en) | Receiver-controlled panoramic view video share | |
US10051180B1 (en) | Method and system for removing an obstructing object in a panoramic image | |
JP6302564B2 (en) | Movie editing apparatus, movie editing method, and movie editing program | |
CN102207674A (en) | Panorama image shooting apparatus and method | |
KR20160144414A (en) | Mount that facilitates positioning and orienting a mobile computing device | |
JP2011146957A (en) | Imaging apparatus, control method thereof, and program | |
US20160088230A1 (en) | Systems and methods for camera operation through control device | |
EP2645700A1 (en) | Method and device for motion enhanced image capture | |
US20170111574A1 (en) | Imaging apparatus and imaging method | |
JP6107518B2 (en) | Information processing apparatus, information processing method, and program | |
US20220182551A1 (en) | Display method, imaging method and related devices | |
WO2017015959A1 (en) | Method, control device and control system for controlling mobile device to photograph | |
JP2017162371A (en) | Image processing device, image processing method and program | |
US20150189151A1 (en) | Information processing apparatus, imaging apparatus, information processing method, information processing program, and imaging system | |
CN109155820B (en) | Spherical panorama synthesis method, unmanned aerial vehicle system, unmanned aerial vehicle, terminal and control method thereof | |
JP2018007041A (en) | Imaging apparatus, display device, and imaging and display system | |
WO2022041013A1 (en) | Control method, handheld gimbal, system, and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: YOUBIQ, LLC, MASSACHUSETTS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FOWLER, GORDON;YONKERS, E. HUBBARD;MEDNIEKS, ZIGURD;SIGNING DATES FROM 20111117 TO 20111118;REEL/FRAME:027943/0617 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20191215 |
|
FEPP | Fee payment procedure |
Free format text: PETITION RELATED TO MAINTENANCE FEES FILED (ORIGINAL EVENT CODE: PMFP); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Free format text: SURCHARGE, PETITION TO ACCEPT PYMT AFTER EXP, UNINTENTIONAL (ORIGINAL EVENT CODE: M3558); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, MICRO ENTITY (ORIGINAL EVENT CODE: M3551); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 4 |
|
PRDP | Patent reinstated due to the acceptance of a late maintenance fee |
Effective date: 20210813 |
|
FEPP | Fee payment procedure |
Free format text: PETITION RELATED TO MAINTENANCE FEES GRANTED (ORIGINAL EVENT CODE: PMFG); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2552); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 8 |